Question
To upload a file from a URL to a cloud storage system, it is usually necessary to first download that file to a server and then upload it to the cloud storage.
For large files, it may be necessary to write the file to disk rather than hold it in memory. Since App Engine does not support writing to the local disk, are there any other options for doing this on App Engine?
I understand that a Managed VM is an option, but I'm trying to make sure that it's definitely not possible to do this on classic App Engine.
Answer 1:
To overcome the lack of a local disk on GAE you can:
- use the Blobstore
- better yet, write directly to Cloud Storage: see Upload images/video to google cloud storage using Google App Engine and Sending images to google cloud storage using google app engine (a minimal sketch follows this list).
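For instance, the App Engine GCS client library exposes Cloud Storage objects as file-like handles, so nothing ever has to touch a local disk. This is only a minimal sketch, assuming the GoogleAppEngineCloudStorageClient library is vendored with the app; the bucket and object names are placeholders:

```python
import cloudstorage as gcs

def write_to_gcs(data, bucket='my-bucket', object_name='my-file'):
    """Write bytes straight to Cloud Storage instead of a local file."""
    gcs_path = '/%s/%s' % (bucket, object_name)
    # gcs.open returns a file-like object backed by Cloud Storage.
    with gcs.open(gcs_path, 'w',
                  content_type='application/octet-stream') as f:
        f.write(data)
    return gcs_path
```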
To download the files to GAE you could use the URL Fetch service, but there are 2 limitations to keep an eye on (a sketch combining URL Fetch with the Cloud Storage write above follows this list):
- the download duration: long transfers cause DeadlineExceededError in URL Fetch; you can raise the deadline to 10 minutes for background/task queue requests, see App Engine Python UrlFetch.set_default_fetch_deadline
- the maximum URL Fetch response size of 32 MB, for which the Sockets service appears to be a workaround (paid apps only): see GAE - urlfetch multipart post not working with large files
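A hedged sketch of the fetch side, assuming it runs in a task queue handler (where the 10-minute deadline applies); the returned bytes can be handed to the write_to_gcs() sketch above:

```python
from google.appengine.api import urlfetch

def fetch_remote_file(source_url):
    # Raise the default 5-second deadline; up to 600 s is allowed when the
    # call is made from a task queue or cron request.
    urlfetch.set_default_fetch_deadline(600)
    # Responses over the 32 MB cap make URL Fetch raise ResponseTooLargeError,
    # so this simple approach only works for smaller files.
    result = urlfetch.fetch(source_url)
    if result.status_code != 200:
        raise RuntimeError('Fetch failed: HTTP %d' % result.status_code)
    return result.content
```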
If the server offering the downloads supports partial (ranged) downloads, it might be possible to get a solution working for files of any size with this info; a sketch of that idea follows.
Note: this is just theoretical; I've only thought about it, I haven't actually tried it.
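Keeping that caveat in mind, here is an equally untested sketch of the ranged-download idea: fetch the file in chunks that stay under the 32 MB cap and stream each chunk into a Cloud Storage object. It assumes the remote server honours Range headers; the names and chunk size are arbitrary:

```python
import cloudstorage as gcs
from google.appengine.api import urlfetch

CHUNK_SIZE = 16 * 1024 * 1024  # stay well under the 32 MB response cap

def copy_url_to_gcs_in_chunks(source_url, gcs_path, content_type):
    urlfetch.set_default_fetch_deadline(600)
    with gcs.open(gcs_path, 'w', content_type=content_type) as dest:
        start = 0
        while True:
            headers = {'Range': 'bytes=%d-%d' % (start, start + CHUNK_SIZE - 1)}
            result = urlfetch.fetch(source_url, headers=headers)
            if result.status_code not in (200, 206):
                raise RuntimeError('Unexpected HTTP %d' % result.status_code)
            dest.write(result.content)
            # HTTP 200 means the server ignored the Range header; a short
            # chunk means the end of the file was reached.
            if result.status_code == 200 or len(result.content) < CHUNK_SIZE:
                break
            start += CHUNK_SIZE
```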
Answer 2:
Yes, it's possible. You can generate a signed upload URL (using the Cloud Storage API) and provide it to a client, which should use it in a POST request.
Here are the docs for PHP, but this approach should work for Python as well.
Here is an example: https://github.com/GoogleCloudPlatform/storage-signedurls-python
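As a rough illustration of what the linked sample does (not the sample itself), the sketch below signs an upload URL with a service account's RSA key using PyCrypto; it signs a PUT upload, one common form of direct client upload. The bucket, object name, account email, and key are placeholders:

```python
import base64
import time
import urllib

from Crypto.Hash import SHA256
from Crypto.PublicKey import RSA
from Crypto.Signature import PKCS1_v1_5

GCS_ENDPOINT = 'https://storage.googleapis.com'

def make_signed_upload_url(bucket, object_name, client_email, private_key_pem,
                           content_type='application/octet-stream',
                           expires_in=3600):
    expiration = int(time.time()) + expires_in
    resource = '/%s/%s' % (bucket, object_name)
    # String to sign per the Cloud Storage signed-URL spec:
    # verb, content MD5 (blank here), content type, expiration, resource.
    to_sign = '\n'.join(['PUT', '', content_type, str(expiration), resource])
    # Sign with the service account's RSA private key (SHA-256).
    signer = PKCS1_v1_5.new(RSA.importKey(private_key_pem))
    signature = base64.b64encode(signer.sign(SHA256.new(to_sign)))
    query = urllib.urlencode({
        'GoogleAccessId': client_email,
        'Expires': expiration,
        'Signature': signature,
    })
    return '%s%s?%s' % (GCS_ENDPOINT, resource, query)
```

The client can then upload directly to Cloud Storage with that URL, so the file never passes through the App Engine app at all.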
Source: https://stackoverflow.com/questions/32972477/is-it-possible-to-upload-a-file-by-url-to-cloud-storage-on-app-engine-without-wr