Question
I'm trying to send large files (50MB-2GB) that I have stored in S3 using filepicker.io to the Google App Engine Blobstore (in Python). I would like this to happen without going through a form in the client browser, as that defeats the project requirements (and often hangs with very large files).
I tried multiple solutions, including:
- loading the file into GAE with urlfetch (but GAE has a 32MB limit on requests/responses)
- constructing a multipart form in Python and sending it to blobstore.create_upload_url() (the file can't be transferred by URL alone, and can't be loaded into the script because of the 32MB limit); a streaming sketch of this approach follows below
- using boto to read the file straight into the Blobstore (the connection times out, and the logs show "encountered HTTPException exception" from boto, which triggers "CancelledError: The API call logservice.Flush() was explicitly cancelled." from GAE and crashes the process)
I am struggling to find a working solution. Any hints on how I could perform this transfer, or on how to pass the file from S3 as a form attachment without first loading it in Python (i.e. just specifying its URL), would be very much appreciated.
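For reference, here is a rough sketch of how the multipart POST could stream straight from S3 instead of buffering the file. This is only an illustration: the bucket, key, host, chunk size, and upload path are placeholders, and it assumes classic boto and the Python 2 stdlib:

```python
# Illustrative sketch: stream an S3 object into a multipart POST
# without holding the whole file in memory. All names are placeholders.
import httplib
import uuid

import boto

key = boto.connect_s3().get_bucket('my-bucket').get_key('big-file.bin')

boundary = uuid.uuid4().hex
head = ('--%s\r\n'
        'Content-Disposition: form-data; name="file"; filename="big-file.bin"\r\n'
        'Content-Type: application/octet-stream\r\n\r\n' % boundary)
tail = '\r\n--%s--\r\n' % boundary

conn = httplib.HTTPSConnection('your-app.appspot.com')
conn.putrequest('POST', '/_ah/upload/...')  # the URL from create_upload_url()
conn.putheader('Content-Type', 'multipart/form-data; boundary=%s' % boundary)
# key.size is known up front, so Content-Length can be exact.
conn.putheader('Content-Length', str(len(head) + key.size + len(tail)))
conn.endheaders()

conn.send(head)
while True:
    chunk = key.read(8 * 1024 * 1024)  # 8MB pieces; boto Keys are file-like
    if not chunk:
        break
    conn.send(chunk)
conn.send(tail)

resp = conn.getresponse()
print resp.status, resp.read()
```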
Answer 1:
The BlobstoreUploadHandler isn't constrained by the 32MB request limit: https://developers.google.com/appengine/docs/python/tools/webapp/blobstorehandlers. However, I'm not sure how this might fit into your app. If you can POST the file to an endpoint handled by a BlobstoreUploadHandler, you should be good to go.
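To make this concrete, here is a minimal sketch of the App Engine side, assuming webapp2; the routes and the 'file' field name are illustrative:

```python
import webapp2
from google.appengine.ext import blobstore
from google.appengine.ext.webapp import blobstore_handlers


class StartHandler(webapp2.RequestHandler):
    def get(self):
        # Hand out a one-time URL that accepts a multipart/form-data
        # POST. The Blobstore service consumes the body itself, which
        # is why the 32MB request limit doesn't apply to the upload.
        self.response.out.write(blobstore.create_upload_url('/upload'))


class UploadHandler(blobstore_handlers.BlobstoreUploadHandler):
    def post(self):
        # get_uploads() returns BlobInfo records; only metadata reaches
        # this handler, the bytes are already in the Blobstore.
        blob_info = self.get_uploads('file')[0]
        # Upload handlers are expected to answer with a redirect.
        self.redirect('/done?key=%s' % blob_info.key())


class DoneHandler(webapp2.RequestHandler):
    def get(self):
        self.response.out.write('stored as %s' % self.request.get('key'))


app = webapp2.WSGIApplication([
    ('/start', StartHandler),
    ('/upload', UploadHandler),
    ('/done', DoneHandler),
])
```

A sender (for example the streaming sketch in the question) would first GET /start to obtain a fresh one-time URL, then POST the multipart body to it.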
Source: https://stackoverflow.com/questions/15970514/send-large-files-to-gae-blobstore-via-api-from-s3