Asynchronous File Upload to Amazon S3 with Django

Asked by 轻奢々 on 2021-01-30 04:18

I am using this file storage engine to store files to Amazon S3 when they are uploaded:

http://code.welldev.org/django-storages/wiki/Home

It takes quite a long time to upload, since the file first has to travel from the client to the web server and then from the web server on to Amazon S3 before a response is returned. I would like the transfer to S3 to happen asynchronously, so the response does not block on it.
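
For context, a minimal sketch of the synchronous setup this question starts from (the key and bucket values below are placeholders, not from the original post):

    # settings.py -- synchronous django-storages S3 backend: each
    # FileField save blocks until the copy to S3 has finished.
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    AWS_ACCESS_KEY_ID = '...'                # placeholder
    AWS_SECRET_ACCESS_KEY = '...'            # placeholder
    AWS_STORAGE_BUCKET_NAME = 'my-bucket'    # hypothetical bucket name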

7 Answers

  •  离开以前, answered 2021-01-30 04:57

    There is an app for that :-)

    https://github.com/jezdez/django-queued-storage

    It does exactly what you need, and much more, because you can pair any "local" storage with any "remote" storage. The app stores your file in a fast "local" storage (for example, MogileFS) and then, using Celery (django-celery), uploads it asynchronously to the "remote" storage.

    A few remarks:

    1. You can set it up either with a copy strategy (the local file is kept after uploading) or with an upload-and-delete strategy that removes the local file once the upload succeeds.

    2. It keeps serving the file from the "local" storage until the upload to the "remote" storage has completed.

    3. It can also be configured to retry a number of times when an upload fails; see the sketch after this list.
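
    Beyond the original answer, a minimal sketch of tuning the retry behaviour by subclassing the transfer task. The max_retries and delay attribute names are assumptions about the app's task API and may differ between versions, so check the documentation for yours:

    # tasks.py (hypothetical module): retry more patiently than the default.
    from queued_storage.tasks import TransferAndDelete

    class PatientTransferAndDelete(TransferAndDelete):
        max_retries = 10   # assumed attribute: attempts before giving up
        delay = 60         # assumed attribute: seconds between attempts

    You would then pass task='myapp.tasks.PatientTransferAndDelete' (a hypothetical dotted path) to QueuedStorage instead of the default task.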

    Installation and usage are also very simple and straightforward:

    pip install django-queued-storage
    

    append to INSTALLED_APPS:

    INSTALLED_APPS += ('queued_storage',)
    

    in models.py:

    from django.db import models

    from queued_storage.backends import QueuedStorage

    # Store uploads on local disk first; a Celery task then moves each
    # file to S3 and deletes the local copy once the transfer succeeds.
    queued_s3storage = QueuedStorage(
        'django.core.files.storage.FileSystemStorage',
        'storages.backends.s3boto.S3BotoStorage',
        task='queued_storage.tasks.TransferAndDelete')

    class MyModel(models.Model):
        my_file = models.FileField(upload_to='files', storage=queued_s3storage)
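
    A usage note beyond the original answer: saving works like any other FileField, and the response can return while Celery moves the file to S3 in the background (some_uploaded_file stands in for a hypothetical UploadedFile instance):

    # Hedged usage sketch: the save itself only touches local storage;
    # the transfer to S3 happens later in the Celery worker.
    obj = MyModel(my_file=some_uploaded_file)
    obj.save()

    Remember that a Celery worker must be running for the queued transfers to execute; with the django-celery integration mentioned above it is typically started with "python manage.py celeryd" (the exact command depends on your Celery version).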
    
