Question
It works fine with small files; it only fails when I try to upload large files. I'm using the Python client. The snippet is:
import os
from google.cloud import storage

filename = 'my_csv.csv'
storage_client = storage.Client()
bucket_name = os.environ["GOOGLE_STORAGE_BUCKET"]
bucket = storage_client.get_bucket(bucket_name)
blob = bucket.blob(filename)
blob.upload_from_filename(filename)  # file size is 500 MB
The only thing I get as a traceback is "Killed", and I'm dropped out of the Python interpreter.
Any suggestions are highly appreciated.
Edit: It works fine from my local machine. My application runs in Google Container Engine, so the problem occurs there, when the upload runs inside a Celery task.
Answer 1:
Try uploading the file in chunks. You can find samples here (search for request.next_chunk()).
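A minimal sketch of that pattern, assuming the older google-api-python-client library (where resumable uploads expose next_chunk()); the file name and bucket environment variable are carried over from the question:

import os

from googleapiclient.discovery import build
from googleapiclient.http import MediaFileUpload

# Build a client for the Cloud Storage JSON API.
service = build("storage", "v1")

filename = "my_csv.csv"
bucket_name = os.environ["GOOGLE_STORAGE_BUCKET"]

# resumable=True enables chunked uploads; the chunk size here (10 MB)
# is illustrative and must be a multiple of 256 KB.
media = MediaFileUpload(filename, chunksize=1024 * 1024 * 10, resumable=True)
request = service.objects().insert(bucket=bucket_name, name=filename,
                                   media_body=media)

response = None
while response is None:
    # Each call uploads one chunk, so memory usage stays bounded.
    status, response = request.next_chunk()
    if status:
        print("Uploaded {:d}%".format(int(status.progress() * 100)))

Because each iteration sends only one chunk, the whole 500 MB file is never held in memory at once, which is what matters if the container is being OOM-killed.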
Answer 2:
upload_from_filename attempts to upload the entire file in a single request. You can use Blob.chunk_size to spread the upload across many requests, each responsible for uploading one "chunk" of your file.
For example:
my_blob.chunk_size = 1024 * 1024 * 10
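Putting that together with the snippet from the question, a sketch of the full upload (the 10 MB chunk size is illustrative; the google-cloud-storage client requires chunk_size to be a multiple of 256 KB, i.e. 262144 bytes):

import os
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.get_bucket(os.environ["GOOGLE_STORAGE_BUCKET"])
blob = bucket.blob("my_csv.csv")

# 10 MB per request; must be a multiple of 256 KB (262144 bytes).
blob.chunk_size = 1024 * 1024 * 10

# With chunk_size set, this runs as a resumable upload in 10 MB
# chunks instead of one single 500 MB request.
blob.upload_from_filename("my_csv.csv")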
Source: https://stackoverflow.com/questions/45300037/cannot-upload-large-file-to-google-cloud-storage