Google App Engine: How to write large files to Google Cloud Storage

Backend · Unresolved · 3 answers · 1551 views
余生分开走 · Asked 2021-01-03 03:45

I am trying to save large files from Google App Engine's Blobstore to Google Cloud Storage to facilitate backup.

It works fine for small files (<10 MB), but it fails for larger files.

3 Answers
  • 2021-01-03 04:01

    I was having the same issue and ended up writing an iterator around fetch_data and catching the exception. It works, but it is a work-around.

    Rewriting your code would look something like this:

    from google.appengine.ext import blobstore
    from google.appengine.api import files

    def iter_blobstore(blob, fetch_size=524288):
      start_index = 0

      while True:
        # fetch_data's end index is inclusive, so read fetch_size bytes
        # per call; past the end of the blob it returns an empty string.
        read = blobstore.fetch_data(blob, start_index,
                                    start_index + fetch_size - 1)

        if read == "":
          break

        start_index += fetch_size

        yield read


    PATH = '/gs/backupbucket/'
    for df in DocumentFile.all():
      fn = df.blob.filename
      write_path = files.gs.create(PATH + fn.encode('utf-8'),
                                   mime_type='application/zip',
                                   acl='project-private')
      with files.open(write_path, 'a') as fp:
        for buf in iter_blobstore(df.blob):
          try:
            fp.write(buf)
          except files.FileNotOpenedError:
            # Work-around: the file handle can drop mid-write; skipping
            # the exception keeps the loop going.
            pass
      # Finalize once, after all chunks have been appended.
      files.finalize(write_path)
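    The fetch-and-yield pattern above can be exercised outside App Engine. Here is a minimal sketch with a plain bytes object standing in for the blob; `iter_chunks` is an illustrative helper, not part of the Blobstore API:

```python
def iter_chunks(data, fetch_size=524288):
    """Yield successive fetch_size-byte slices of data until exhausted,
    mirroring the iter_blobstore loop above."""
    start = 0
    while True:
        chunk = data[start:start + fetch_size]
        if not chunk:
            # Past the end of the data: stop, like fetch_data
            # returning an empty string.
            break
        start += fetch_size
        yield chunk

# Example: a 5-byte "blob" read in 2-byte chunks.
blob = b"hello"
chunks = list(iter_chunks(blob, fetch_size=2))
# chunks == [b"he", b"ll", b"o"]
```

    The key property is that concatenating the chunks reproduces the original data, so nothing is lost by reading in pieces.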
    
  • 2021-01-03 04:01

    In my opinion you shouldn't call files.finalize(write_path) at intervals: finalizing makes the file readable, and you can't change it back to writable again.
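    To illustrate the lifecycle (a toy stand-in, not the real Files API): you can append as many times as you like while the file is open, but once finalized it is read-only for good.

```python
class ToyGsFile:
    """Toy model of the Files API lifecycle: writable until
    finalized, rejecting writes afterwards."""
    def __init__(self):
        self.parts = []
        self.finalized = False

    def write(self, data):
        if self.finalized:
            raise IOError("already finalized; cannot be made writable again")
        self.parts.append(data)

    def finalize(self):
        self.finalized = True

f = ToyGsFile()
for part in (b"a", b"b", b"c"):
    f.write(part)   # append in as many calls as needed...
f.finalize()        # ...but finalize exactly once, at the very end
```

    This is why finalize belongs after the write loop, never inside it.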

  • 2021-01-03 04:08

    Are backends an option you can use? A backend runs in the background and has much greater power than the TaskQueue.
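    If backends fit your setup, the backup job could run on a dedicated backend declared in backends.yaml. A sketch, where the name and instance class are placeholders to adjust for your app:

```yaml
backends:
- name: backup        # placeholder backend name
  class: B4           # larger instance class for more memory/CPU
  instances: 1
  options: dynamic    # start on demand, shut down when idle
```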
