gsutil

Set metadata for all objects in a Google Storage bucket

Posted by 跟風遠走 on 2020-03-01 06:32:07

Question: I want to set the Content-Type metadata to image/jpeg for all objects in a Google Storage bucket. How do I do this?

Answer 1: Use gsutil and its setmeta command:

gsutil -m setmeta -h "Content-Type:image/jpeg" gs://YOUR_BUCKET/**/*.jpg

The -m flag activates parallel updates, which helps when you have many objects. The /**/* pattern performs a recursive search through any folders you may have in your bucket.

Source: https://stackoverflow.com/questions/52691451/set-metadata-for-all-objects-in-a-google
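If the bucket mixes image formats, the same setmeta call can be repeated per extension. A minimal sketch, assuming a hypothetical helper `content_type_for` that maps an extension to a MIME type (the gsutil line is commented out because it needs bucket credentials):

```shell
# Hypothetical helper: pick the Content-Type to set from a file's extension.
content_type_for() {
  case "${1##*.}" in
    jpg|jpeg) echo "image/jpeg" ;;
    png)      echo "image/png" ;;
    gif)      echo "image/gif" ;;
    *)        echo "application/octet-stream" ;;
  esac
}

content_type_for photo.jpg    # prints image/jpeg

# Apply per extension (requires gsutil and access to YOUR_BUCKET):
# gsutil -m setmeta -h "Content-Type:$(content_type_for x.jpg)" 'gs://YOUR_BUCKET/**/*.jpg'
```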

Google Cloud Storage ACL not working for bucket

Posted by 与世无争的帅哥 on 2020-02-05 06:38:20

Question: I granted access to certain users by their email addresses (like jane@gmail.com) on my bucket, hosted on Google Cloud Storage. However, when one of those users is logged in to their Gmail account in Chrome, they can't access the file; it just says permission denied. What's going on? The link I'm using is something like http://storage.googleapis.com/my-bucket/my-object, and on my dashboard I've definitely configured their Gmail accounts to be able to access my bucket (and even specific files). I
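The question is cut off before any answer, but the URL pattern it uses can be sketched, along with the gsutil command for granting a single account read access. The bucket and object names are the question's placeholders; the `acl ch` line is commented out because it requires owner rights on the bucket:

```shell
# The link format from the question, assembled from bucket and object names:
bucket=my-bucket
object=my-object
echo "http://storage.googleapis.com/$bucket/$object"
# prints http://storage.googleapis.com/my-bucket/my-object

# Granting jane@gmail.com read access to one object (sketch; needs owner rights):
# gsutil acl ch -u jane@gmail.com:R "gs://$bucket/$object"
```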

Gsutil uses a lot of memory when download multiple files with a lot of processes

Posted by 99封情书 on 2020-01-06 05:57:28

Question: I need to download multiple files with gsutil, and I noticed that gsutil uses a lot of memory when downloading multiple files (around 1-2 GB of RAM when downloading three 2 GB files with 9 processes each). Is there a way to tune gsutil's memory usage? This is important to me because I am running gsutil in GKE, and a container will get killed if it uses too much memory (more than its limit). Another issue: it seems gsutil cannot download files with the same name in a single command (one will
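gsutil reads its parallelism settings from the boto configuration file, so one way to bound memory is to lower the process and thread counts there (option names are from gsutil's boto config; the right values depend on your container's memory limit):

```ini
# ~/.boto — fewer workers means fewer concurrent in-flight downloads
[GSUtil]
parallel_process_count = 3
parallel_thread_count = 2
```

The same options can be overridden per invocation with `-o`, e.g. `gsutil -o "GSUtil:parallel_process_count=3" -m cp ...`, which is convenient in a GKE container where editing ~/.boto is awkward.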

Download public data to google storage

Posted by 心已入冬 on 2020-01-03 06:45:15

Question: I want to download public data directly from a link into my Google Storage bucket using gsutil. I can't find a command to do so.

Answer 1: You can use curl to fetch data from a link and pipe it to gsutil (which will stream it to Google Cloud Storage), like so:

curl 'https://www.website.com/link-to-your-data' | gsutil cp - gs://your-bucket-name/your-object-name

Source: https://stackoverflow.com/questions/47778015/download-public-data-to-google-storage
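One convenience worth adding to the answer's pipeline: the object name can be derived from the URL instead of typed by hand. A sketch, with the actual transfer commented out because it needs gsutil credentials (the URL and bucket are placeholders):

```shell
# Derive the object name from the last path segment of the URL:
url="https://www.website.com/path/photo-2024.jpg"
object="${url##*/}"     # strip everything up to the last slash
echo "$object"          # prints photo-2024.jpg

# Stream the download straight into the bucket (requires gsutil credentials):
# curl -L "$url" | gsutil cp - "gs://your-bucket-name/$object"
```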

Mass rename objects on Google Cloud Storage

Posted by 廉价感情. on 2020-01-02 11:07:31

Question: Is it possible to mass-rename objects on Google Cloud Storage using gsutil (or some other tool)? I am trying to figure out a way to rename a bunch of images from *.JPG to *.jpg.

Answer 1: https://cloud.google.com/storage/docs/gsutil/addlhelp/WildcardNames — gsutil supports URI wildcards.

EDIT: From the gsutil 3.0 release notes: "As part of the bucket sub-directory support we changed the * wildcard to match only up to directory boundaries, and introduced the new ** wildcard..." Do you have directories under the bucket?
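Since gsutil wildcards match but don't rewrite names, the usual approach is to list the matching objects and move each one individually. A sketch, assuming a hypothetical bucket `my-bucket` (the gsutil loop is commented out because it needs bucket access, and note that `mv` on GCS is a copy plus delete per object):

```shell
# The per-object name transform — pure string logic, shown on one example:
src="gs://my-bucket/photos/IMG_0001.JPG"
dst="${src%.JPG}.jpg"   # replace the .JPG suffix with .jpg
echo "$dst"             # prints gs://my-bucket/photos/IMG_0001.jpg

# Applying it across the bucket (requires gsutil; ** recurses into folders):
# gsutil ls 'gs://my-bucket/**.JPG' | while read -r src; do
#   gsutil mv "$src" "${src%.JPG}.jpg"
# done
```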

Google cloud storage - Download file from web

Posted by 若如初见. on 2020-01-01 09:54:15

Question: I want to use Google Cloud Storage in my next project. My aim is to track various websites and collect some photos. From reading the gsutil documentation, I know I can download a file manually to my server and then upload it to Google Cloud Storage using gsutil, but downloading and uploading files generates a lot of traffic on my server. Is there a way to let Google Cloud download the file directly over HTTP?

Answer 1: This is very easy to do from the Google Cloud Shell, as long as your download is less than
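The answer is cut off at the size limit, which is presumably Cloud Shell's home-disk quota (5 GB is the documented figure at the time of writing; treat it as an assumption). A sketch of the size check plus the two-step transfer, with the network steps commented out:

```shell
# Will a 3 GB download fit under an assumed 5 GB Cloud Shell disk quota?
quota=$((5 * 1024 * 1024 * 1024))
size=3221225472                       # 3 GB in bytes
if [ "$size" -lt "$quota" ]; then echo fits; else echo "too big"; fi
# prints fits

# If it fits, download inside Cloud Shell and then upload; traffic stays
# off your own server (placeholder URL and bucket, needs gsutil credentials):
# curl -L -o data.bin 'https://www.website.com/link-to-your-data'
# gsutil cp data.bin gs://your-bucket-name/
```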

I cannot upload large (> 2GB) files to the Google Cloud Storage web UI

Posted by [亡魂溺海] on 2019-12-30 09:58:30

Question: I have been using the Google Cloud Storage Manager link in the Google APIs console to upload my files. This works great for most files: 1 KB, 10 KB, 1 MB, 10 MB, 100 MB. However, yesterday I could not upload a 3 GB file. Any idea what is wrong? What is the best way to upload large files to Google Cloud Storage?

Answer 1: The web UI only supports uploads smaller than 2^32 bytes (4 gigabytes); I believe this is a JavaScript limitation. If you need to transfer many or large files, consider using
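The answer is cut off, but it presumably goes on to recommend gsutil, which handles large objects with resumable uploads by default. A sketch: the quoted web-UI cap as bytes, then the upload command (commented out, since it needs a real file and bucket; the composite-upload option name is taken from gsutil's boto settings):

```shell
# The web-UI limit quoted in the answer, expressed in bytes:
limit=$((1 << 32))
echo "$limit"          # prints 4294967296 (4 GiB)

# For files near or above that size, upload with gsutil instead. Files above
# the threshold are split into parallel composite uploads:
# gsutil -o GSUtil:parallel_composite_upload_threshold=150M cp big.iso gs://YOUR_BUCKET/
```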
