google-cloud-storage

InvocationTargetException storing files with appengine-gcs-client-0.5 dev_appserver

Question: I'm using appengine-gcs-client-0.5 and seeing InvocationTargetExceptions in my dev_appserver when calling GcsService.createOrReplace and GcsOutputChannel.close. It seems the call to storeBlob does not have the appropriate permission, as the appserver gets an AccessControlException in com.google.appengine.api.blobstore.dev.FileBlobStorage.storeBlob:

java.security.AccessControlException: access denied ("java.io.FilePermission" "/tmp/1440435923000-0/encoded_gs_key:<some key>" "write")

What …

What are the pros and cons of loading data directly into Google BigQuery vs going through Cloud Storage first?

Question: Also, is there anything wrong with doing transforms/joins directly within BigQuery? I'd like to minimize the number of components and steps involved for a data warehouse I'm setting up (simple transaction and inventory data for a chain of retail stores).

Answer 1: Loading data via Cloud Storage is the fastest (and the cheapest) way. Loading directly can be done via an app (using streaming inserts, which add some additional cost). As for doing transformations: if what you plan/need to do can be done …
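To illustrate the GCS-first load path the answer recommends, here is a minimal sketch using the google-cloud-bigquery Python library; the bucket, dataset, and table names are placeholders, not from the original question.

```python
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the CSV header row
    autodetect=True,      # let BigQuery infer the schema
)

# Load a file already staged in Cloud Storage into a table.
load_job = client.load_table_from_uri(
    "gs://my-bucket/transactions.csv",  # placeholder URI
    "my_dataset.transactions",          # placeholder table
    job_config=job_config,
)
load_job.result()  # block until the load job completes
```

Streaming inserts, by contrast, send records one request at a time and are billed separately, which is the "additional cost" the answer mentions.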

Writing data to google cloud storage using python

Question: I cannot find a way to write a data set from my local machine into Google Cloud Storage using Python. I have researched a lot but didn't find any clue regarding this. Need help, thanks.

Answer 1: Quick example, using the google-cloud Python library:

```python
from google.cloud import storage

def upload_blob(bucket_name, source_file_name, destination_blob_name):
    """Uploads a file to the bucket."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_blob_name)
    blob.upload_from_filename(source_file_name)
```
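A call to this helper might look like the following; the bucket and file names here are hypothetical.

```python
# Hypothetical usage of the helper above; names are placeholders.
upload_blob("my-bucket", "data/local_dataset.csv", "datasets/dataset.csv")
```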

Can I upload files to google cloud storage from url?

Question: Code example:

gsutil cp "http://www.exemple.com/file.txt" "gs://bucket/"

Answer 1: You can stream the output of curl to the gsutil cp command as follows:

curl http://www.example.com/file.txt | gsutil cp - gs://bucket/file.txt

Answer 2: There is the Cloud Storage Transfer Service, which has an option to upload a list of URLs, though it is not very simple and is more batch-oriented.

Source: https://stackoverflow.com/questions/18107545/can-i-upload-files-to-google-cloud-storage-from-url
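The same streaming idea can be expressed in Python. This is a sketch only, assuming the requests and google-cloud-storage libraries; the helper name and argument values are mine, not from the thread.

```python
import requests
from google.cloud import storage

def upload_from_url(url, bucket_name, blob_name):
    """Stream an HTTP download into a GCS object without touching local disk."""
    response = requests.get(url, stream=True)
    response.raise_for_status()
    blob = storage.Client().get_bucket(bucket_name).blob(blob_name)
    # upload_from_file accepts any file-like object with a read() method;
    # response.raw is the underlying urllib3 stream.
    blob.upload_from_file(response.raw)

upload_from_url("http://www.example.com/file.txt", "bucket", "file.txt")
```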

Slow GCS upload speeds when using HTTP/2

Question: Context: we are uploading files directly to GCS from a web interface via AJAX calls, using a parallel composite upload approach. While running tests in different scenarios, we noticed that on some networks the upload speed is capped at around 50 Mbps, even though on all of them the bandwidth is between 100 Mbps and 1 Gbps. We ran gsutil perfdiag inside one of the "troubled" networks in order to emulate the web-interface upload approach and got significantly better performance. When comparing the …
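For readers unfamiliar with the technique: a parallel composite upload splits a file into chunks, uploads the chunks concurrently as separate objects, and then concatenates them server-side. Below is a deliberately sequential sketch of that pattern with the google-cloud-storage Python library; all names and chunk contents are placeholders, and a real implementation would upload the parts in parallel.

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-bucket")  # placeholder bucket

# Upload each chunk as its own temporary object.
parts = []
for i, chunk in enumerate([b"part one ", b"part two ", b"part three"]):
    part = bucket.blob(f"parts/bigfile.part{i}")
    part.upload_from_string(chunk)
    parts.append(part)

# Server-side concatenation of the parts into the final object.
final = bucket.blob("bigfile")
final.compose(parts)

# Clean up the temporary part objects.
for part in parts:
    part.delete()
```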

Should I move my static resources from App Engine to Google Cloud Storage?

Question: We have a web application in App Engine. I was wondering whether it is a good idea to move my static resources (i.e. images, CSS files, and JS files) out of App Engine and serve them from Google Cloud Storage. My thinking here is two-fold: 1) we can get the advantages of a CDN with Google Cloud Storage. We can even configure metadata for each file to set expiration headers, gzip compression, etc. Also, by serving files from different domains we can have browsers download more content in …
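As an illustration of the metadata point, here is a sketch of setting an expiration (Cache-Control) header on one object with the google-cloud-storage Python library; the bucket and object names are placeholders.

```python
from google.cloud import storage

client = storage.Client()
blob = client.get_bucket("my-static-assets").blob("css/site.css")

blob.cache_control = "public, max-age=86400"  # let browsers/CDNs cache for a day
blob.content_type = "text/css"
blob.patch()  # push the metadata change to Cloud Storage
```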

How do I use pandas.read_csv on Google Cloud ML?

Question: I'm trying to deploy a training script on Google Cloud ML. Of course, I've uploaded my datasets (CSV files) to a bucket on GCS. I used to import my data with read_csv from pandas, but it doesn't seem to work with a GCS path. How should I proceed (I would like to keep using pandas)?

```python
import pandas as pd

data = pd.read_csv("gs://bucket/folder/file.csv")
```

Output:

ERROR 2018-02-01 18:43:34 +0100 master-replica-0 IOError: File gs://bucket/folder/file.csv does not exist

Answer 1: You will need to use …
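The answer is truncated above; a common approach from that era (my assumption, not the preserved answer) was to open the file through TensorFlow's file_io module, which understands gs:// paths on Cloud ML workers, and hand the file object to pandas.

```python
import pandas as pd
from tensorflow.python.lib.io import file_io  # ships with TensorFlow

# file_io understands gs:// paths, so the file object can be handed
# straight to pandas. The path is the one from the question.
with file_io.FileIO("gs://bucket/folder/file.csv", mode="r") as f:
    data = pd.read_csv(f)
```

Recent pandas versions can also read gs:// paths directly if the gcsfs package is installed.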

Set metadata in Google Cloud Storage using gcloud-python

Question: I am trying to upload a file to Google Cloud Storage using gcloud-python and set some custom metadata properties. To try this I have created a simple script.

```python
import os

from gcloud import storage

client = storage.Client('super secret app id')
bucket = client.get_bucket('super secret bucket name')
blob = bucket.get_blob('kirby.png')
blob.metadata = blob.metadata or {}
blob.metadata['Color'] = 'Pink'

with open(os.path.expanduser('~/Pictures/kirby.png'), 'rb') as img_data:
    blob.upload_from_file(img_data)
```
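No answer is preserved for this entry. One plausible gotcha (my assumption, not the thread's confirmed fix) is that custom metadata assigned locally is not sent by the upload call itself and has to be pushed with an explicit metadata update:

```python
# Assumption: persist the custom metadata with a separate PATCH request.
blob.metadata = {'Color': 'Pink'}
blob.patch()  # sends the metadata change to Cloud Storage
```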

How to disable google cloud storage bucket list from acl control?

Question: We're using Google Cloud Storage as our CDN. However, any visitor can list all files by typing: http://ourcdn.storage.googleapis.com/ How can we disable this while keeping all the files under the bucket publicly readable by default? We previously set the default ACL using:

gsutil defacl ch -g AllUsers:READ

Answer 1: Your defacl looks good. The problem is most likely that for some reason AllUsers also has READ, WRITE, or FULL_CONTROL on the bucket itself. You can clear those with a command like this: gsutil …
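The gsutil command is cut off in the source. As an alternative sketch of the same bucket-level revocation (my construction, not the quoted answer), using the google-cloud-storage Python library with a placeholder bucket name:

```python
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("ourcdn")  # placeholder bucket name

bucket.acl.reload()           # fetch the current bucket-level ACL
all_users = bucket.acl.all()  # the AllUsers entity
all_users.revoke_read()       # drop READ on the bucket itself
all_users.revoke_write()      # and WRITE
all_users.revoke_owner()      # and FULL_CONTROL
bucket.acl.save()             # persist; object ACLs stay public-read
```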

Upload files to Google cloud storage from appengine app

Question: I'm sure the answer to this question is easy, but for me it's proven to be very frustrating since I can't put any solution I've found into practical code for my own use. I'm building an app on App Engine that lets the user upload a file, which then gets acted on by the app. The size of the file is typically around a few MB, and in some cases up to maybe 20 MB or so. This is enough to trigger the 30-second timeout in App Engine, and I am therefore trying to upload to Cloud Storage …
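No answer is preserved here either. A classic pattern for this problem (an assumption based on the App Engine Blobstore API of that era, not this thread's answer) is to have the browser POST the file directly to Cloud Storage via a one-time upload URL, so the app's own request handler never holds the file transfer:

```python
from google.appengine.ext import blobstore  # Python 2 App Engine runtime

# Generate a one-time URL that uploads straight into a GCS bucket.
# The callback path and bucket name are placeholders.
upload_url = blobstore.create_upload_url(
    '/upload_complete',               # handler invoked after the upload
    gs_bucket_name='my-app-uploads',  # store the file in GCS, not Blobstore
)
# Use upload_url as the action of an HTML multipart/form-data form; the
# 30-second limit then applies only to the short callback request.
```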