Cannot upload large file to Google Cloud Storage

Submitted by 烈酒焚心 on 2020-01-15 01:22:47

Question


It works fine with small files; it fails only when I try to upload large files. I'm using the Python client library. The snippet is:

import os

from google.cloud import storage

filename = 'my_csv.csv'
storage_client = storage.Client()
bucket_name = os.environ["GOOGLE_STORAGE_BUCKET"]
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(filename)
blob.upload_from_filename(filename)  # file size is 500 MB

The only output I get instead of a traceback is "Killed", and I'm dropped back out of the Python interpreter.

Any suggestions are highly appreciated.

Edit: It works fine from my local machine. My application runs in Google Container Engine, so the problem occurs there, when the upload runs inside a Celery task.


Answer 1:


Try uploading the file in chunks. You can find samples here (search for request.next_chunk()).
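The next_chunk() pattern comes from the older google-api-python-client rather than the google-cloud-storage client used in the question. A minimal sketch of that pattern, assuming a bucket name, file name, and already-configured credentials (all placeholders here):

```python
# Hedged sketch of the next_chunk() resumable-upload pattern from the
# older google-api-python-client; bucket, file, and auth are assumptions.

def upload_in_chunks(bucket_name, filename, chunksize=10 * 1024 * 1024):
    # Imported lazily so the sketch can be read without the library installed.
    from googleapiclient.discovery import build
    from googleapiclient.http import MediaFileUpload

    service = build("storage", "v1")
    media = MediaFileUpload(filename, chunksize=chunksize, resumable=True)
    request = service.objects().insert(
        bucket=bucket_name, name=filename, media_body=media
    )
    response = None
    while response is None:
        # next_chunk() sends one chunk per call and reports progress,
        # so memory use stays bounded by chunksize instead of the file size.
        status, response = request.next_chunk()
        if status:
            print("Uploaded {:d}%".format(int(status.progress() * 100)))
    return response
```

Because each iteration holds only one chunk in memory, this avoids the out-of-memory kill the question describes.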




Answer 2:


upload_from_filename attempts to upload the entire file in a single request.

You can use Blob.chunk_size to spread the upload across many requests, each responsible for uploading one "chunk" of your file.

For example:

my_blob.chunk_size = 1024 * 1024 * 10  # 10 MB per request; must be a multiple of 256 KB
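Note that the client requires chunk_size to be a positive multiple of 256 KB, which 10 MB satisfies. A small self-contained sketch of that constraint and of setting it before the upload (the helper name is mine, not part of the library):

```python
# The google-cloud-storage client only accepts chunk sizes that are
# positive multiples of 256 KB; valid_chunk_size() is a hypothetical helper.
CHUNK_MULTIPLE = 256 * 1024  # 256 KB


def valid_chunk_size(n_bytes):
    """Return True if n_bytes is a positive multiple of 256 KB."""
    return n_bytes > 0 and n_bytes % CHUNK_MULTIPLE == 0


chunk = 1024 * 1024 * 10  # 10 MB
assert valid_chunk_size(chunk)


def upload_chunked(bucket, filename, chunk_size=chunk):
    # Setting chunk_size before the upload makes upload_from_filename
    # issue one request per chunk instead of one request for the whole file.
    blob = bucket.blob(filename)
    blob.chunk_size = chunk_size
    blob.upload_from_filename(filename)
```

With chunk_size set, each request carries at most 10 MB, so a 500 MB file no longer has to fit into a single request body.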



Source: https://stackoverflow.com/questions/45300037/cannot-upload-large-file-to-google-cloud-storage
