google-cloud-storage

Can't upload large files to a Python + Flask app on GCP App Engine

只愿长相守 submitted on 2021-01-28 02:07:11
Question: UPDATE (5/18/2020): Solution at the end of this post! I'm attempting to upload big CSV files (30MB - 2GB) from a browser to GCP App Engine running Python 3.7 + Flask, and then push those files to GCP Storage. This works fine in local testing with large files, but errors out immediately on GCP with a "413 - Your client issued a request that was too large" if the file is larger than roughly 20MB. The error happens instantly on upload, before the request even reaches my custom Python logic (I suspect
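The excerpt is cut off before the promised solution, but the 413 points at App Engine's hard 32 MB request-size limit. The standard workaround is to bypass App Engine entirely and let the browser upload straight to Cloud Storage through a signed URL. A minimal sketch, assuming a service account credential that can sign (bucket and object names are placeholders):

```python
from datetime import timedelta
from google.cloud import storage

def generate_upload_url(bucket_name: str, blob_name: str) -> str:
    """Return a short-lived URL the browser can PUT the CSV bytes to."""
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    # V4 signed URL valid for 15 minutes; the client must send a matching
    # Content-Type header when it PUTs the file.
    return blob.generate_signed_url(
        version="v4",
        expiration=timedelta(minutes=15),
        method="PUT",
        content_type="text/csv",
    )
```

The Flask endpoint returns this URL to the browser, which then PUTs the file bytes directly to Cloud Storage, so the large payload never passes through App Engine at all.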

Using Cloudflare CDN + HTTPS with Google Cloud Storage

我与影子孤独终老i submitted on 2021-01-27 23:30:22
Question: I'm trying to figure out how to get my Google Cloud Storage bucket to work with Cloudflare. I followed the steps at https://cloud.google.com/storage/docs/static-website and did the following: added a CNAME record for where I want to serve my content (cdn.test.stellarguard.me -> c.storage.googleapis.com), added a Cloud Storage bucket named cdn.test.stellarguard.me, then uploaded a file and made it public: https://storage.googleapis.com/cdn.test.stellarguard.me/logo.svg However, when I go to https:/
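For reference, the bucket-per-domain setup the question describes can be reproduced with the Python client. A sketch, assuming the GCP project has already verified ownership of the domain (names taken from the question):

```python
from google.cloud import storage

client = storage.Client()

# The bucket name must exactly match the host in the CNAME record,
# because c.storage.googleapis.com routes requests by the Host header.
bucket = client.create_bucket("cdn.test.stellarguard.me")

blob = bucket.blob("logo.svg")
blob.upload_from_filename("logo.svg")
blob.make_public()  # served at http://cdn.test.stellarguard.me/logo.svg
```

Note that the CNAME endpoint itself only speaks plain HTTP; HTTPS on the custom domain is exactly what proxying the record through Cloudflare adds.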

Is there a way to make a whole Google Storage Bucket have preset metadata?

天涯浪子 submitted on 2021-01-27 18:41:38
Question: I would like to set one of my storage buckets to automatically assign certain metadata to every newly uploaded object (Content-Disposition: attachment). How do I do this, is there a way? Answer 1: As far as my knowledge goes, there isn't a way to do that. I've been thinking of a workaround and came up with a possible solution: use a Cloud Function that triggers when a file is uploaded to the bucket [1], then POST to the API [2] [3] to edit the metadata of that file. This way, every file added to the
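A minimal sketch of that workaround, assuming the metadata edit goes through the Python client rather than a raw POST (the function name and disposition value are illustrative):

```python
from google.cloud import storage

client = storage.Client()

def set_disposition(event, context):
    """Background function for a google.storage.object.finalize trigger."""
    bucket = client.bucket(event["bucket"])
    blob = bucket.get_blob(event["name"])
    if blob is None:
        return  # object was deleted before the function ran
    blob.content_disposition = "attachment"
    blob.patch()  # push only the changed metadata field
```

Deployed with a finalize trigger on the bucket, this runs once per new object; the patch itself fires a metadataUpdate event rather than finalize, so it does not re-trigger the function.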

How do you make many files private in Google Cloud Storage?

只谈情不闲聊 submitted on 2021-01-27 18:30:29
Question: I have researched a lot and am unable to come up with a solution for this. Here is the code I am using to make all the files public in GCP:

```python
from google.cloud import storage

def make_blob_public(bucket_name, blob_name):
    """Makes a blob publicly accessible."""
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(blob_name)
    blob.make_public()
```

The above method works, but when I call blob.make_private() to make all files private I get the error: AttributeError: 'Blob' object has no
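The AttributeError suggests an older google-cloud-storage release: current versions do ship Blob.make_private(), so upgrading the package is the simplest fix. Failing that, the same effect can be had by revoking the allUsers read grant directly, which is what make_public() does in reverse. A sketch, with the bucket name as a placeholder:

```python
from google.cloud import storage

client = storage.Client()
for blob in client.list_blobs("my-bucket"):
    blob.acl.all().revoke_read()  # drop the allUsers READER grant
    blob.acl.save()               # persist the ACL change per object
```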

Accessing files from Google cloud storage in RStudio

佐手、 submitted on 2021-01-27 14:37:06
Question: I have been trying to create a connection between Google Cloud Storage and an RStudio server (the one I spun up in Google Cloud), so that I can access the files in R to run some analysis on them. I have found three different ways to do it on the web, but I don't see much clarity around them so far: 1. Access the file by using the public URL specific to the file [this is not an option for me]. 2. Mount the Google Cloud Storage as a disk in the RStudio server and access it like any other files in the

Google Cloud Storage and being charged for files not found

那年仲夏 submitted on 2021-01-27 14:32:17
Question: Does anyone know if you are charged for a file request in Google Cloud Storage if the file doesn't exist? In other words, does someone accessing a non-existent file in your bucket count against your requests, or is that only for files that exist? Answer 1: Customers are not charged for requests that result in a 400-level or 500-level HTTP response. The only exception is 404 responses returned for buckets that have Website Configuration enabled with a custom NotFoundPage object. Source: https:/

Google Takeout from G Suite: Download from Google Cloud Storage

梦想与她 submitted on 2021-01-27 14:11:12
Question: I am a G Suite admin for a nonprofit, and I just discovered the Data Export feature, which seems to be like an individual account's Takeout. The export files were prepared and are now available to download from a bucket in Google Cloud Platform Storage. However, there are many, many folders, and going in and out of each one to download the many, many .zip files in each sounds like a major headache to track. I use Transmit on my Mac, and it has the ability to connect to Google Cloud
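The post is cut off, but bulk-downloading the whole export tree is scriptable: gsutil -m cp -r gs://BUCKET . is the usual one-liner, and roughly the same thing in the Python client looks like this (bucket name and local directory are placeholders):

```python
import os
from google.cloud import storage

client = storage.Client()

# Walk every object in the export bucket and mirror it locally,
# recreating the folder structure on disk.
for blob in client.list_blobs("takeout-export-bucket"):
    if blob.name.endswith("/"):
        continue  # skip zero-byte folder placeholders
    local_path = os.path.join("exports", blob.name)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    blob.download_to_filename(local_path)
```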

How to properly handle file uploads using a Node.js Express backend?

我与影子孤独终老i submitted on 2021-01-27 13:57:13
Question: I decided to use ng-flow, an Angular implementation of flow.js, on the front end to handle file uploads, and I picked multer as the middleware to receive the files. I did the simplest possible middleware setup for multer:

```javascript
app.use(multer({ dest: './temp_uploads/' }))
```

I have a POST /upload route and I'm now logging to the console what's being received:

```javascript
app.route('/upload').post(function (request, response, next) {
    console.log(request.body)
    console.log(request.files)
    // Response code and stuff then ...
});
```

So the

Unable to install Google Cloud SDK: “&lt;gcloud.components.update&gt; Failed to fetch component” on Windows 7

谁都会走 submitted on 2021-01-27 13:40:44
Question: When I try to install the Google Cloud SDK, the following error occurs:

```
ERROR: (gcloud.components.update) Failed to fetch component listing from server.
Check your network settings and try again.
Google Cloud SDK installer will now exit.
Press any key to continue . . .
```

together with a partial Python traceback:

```
result = func(*args)
  File "C:\python27_x64\lib\urllib2.py", line 1222, in https_open
    return self.do_open(httplib.HTTPSConnection, req)
  File "C:\python27_x64\lib\urllib2.py", line 1184, in do_open
    raise URLError(err)
urllib2.URLError:
```
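The urllib2.URLError raised inside the SDK's bundled Python 2.7 means the installer could not reach Google's component server. The post is cut off before the root cause, but on machines behind a corporate proxy a commonly suggested fix (an assumption here, not confirmed by the truncated question) is to point both the installer and gcloud at the proxy; host and port below are placeholders:

```
rem Placeholder proxy values -- substitute your actual proxy host and port.
set HTTPS_PROXY=http://proxy.example.com:8080
set HTTP_PROXY=http://proxy.example.com:8080

rem Once the SDK is installed, gcloud can store the proxy settings itself:
gcloud config set proxy/type http
gcloud config set proxy/address proxy.example.com
gcloud config set proxy/port 8080
```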