Scenario: there are multiple folders and many files stored in a storage bucket that is accessible by project team members. Instead of downloading individual files one at a time (which is tedious for a large set of files), you can download whole folders at once.
To download the files to a local machine, you need to:
1. Install gsutil on the local machine.
2. Run the Google Cloud SDK Shell.
3. Run a command like the following (example for the Windows platform; a Linux/macOS equivalent is sketched below):
gsutil -m cp -r gs://source_folder_path "%userprofile%/Downloads"
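The same approach works on Linux or macOS; here is a minimal equivalent, assuming you want the files to land in ~/Downloads (adjust the destination as needed):
gsutil -m cp -r gs://source_folder_path ~/Downloads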
I would suggest downloading the files with gsutil. However, if you have a large number of files to transfer, you might want to use the -m option to perform a parallel (multi-threaded/multi-processing) copy:
gsutil -m cp -R gs://your-bucket .
The time reduction for downloading the files can be quite significant. See the Cloud Storage documentation for complete information on the gsutil cp command.
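If the default level of parallelism is too aggressive for your machine or network, gsutil takes its concurrency settings from the boto config, and you can override them for a single invocation with the top-level -o flag. The counts below are illustrative, not recommendations:
gsutil -o "GSUtil:parallel_process_count=4" -o "GSUtil:parallel_thread_count=8" -m cp -R gs://your-bucket .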
If you want to copy into a particular directory, note that the directory must exist first, as gsutil won't create it automatically, e.g.:
mkdir my-bucket-local-copy && gsutil -m cp -r gs://your-bucket my-bucket-local-copy
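As an optional sanity check after the copy finishes, you can compare the total size of the bucket against your local copy; gsutil du -s reports the summed size of all objects under the given path (bucket name here is illustrative):
gsutil du -s gs://your-bucket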
I recommend using gsutil. GCS's API deals with only one object at a time. However, its command-line utility, gsutil, is more than happy to download a bunch of objects in parallel. Downloading an entire GCS "folder" with gsutil is pretty simple:
$> gsutil cp -r gs://my-bucket/remoteDirectory localDirectory
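If you only need part of the tree, gsutil also understands wildcards, so you can restrict the copy to matching objects. The bucket path and extension below are illustrative; the quotes keep your local shell from expanding the * itself:
$> gsutil -m cp "gs://my-bucket/remoteDirectory/*.csv" localDirectory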