Question
I am using Daphne for both WebSocket and HTTP connections. I am running 4 worker containers and, for now, everything runs locally in Docker.
My Daphne server fails if I try to upload a 400 MB file. It works fine for small files up to 15 MB.
My Docker container quits with exit code 137. I don't get any error in the Daphne logs; the Daphne container just dies while the worker containers keep running.
Does anyone know whether there is a way to increase the upload limit on Daphne, or am I missing something else?
I start the Daphne server with:
daphne -b 0.0.0.0 -p 8001 project.asgi:channel_layer --access-log=${LOGS}/daphne.access.log
Answer 1:
This is because Daphne loads the entire HTTP POST request body into memory, completely and immediately, before transferring control to Django with Channels.
All 400 MB of your upload are loaded into RAM, and your Docker container dies because it runs out of memory: exit code 137 is 128 + 9, meaning the container was killed with SIGKILL, which is what the kernel's OOM killer sends.
This happens even before Django's own request-body size check (the DATA_UPLOAD_MAX_MEMORY_SIZE check). See here.
There is an open ticket here.
If you want to avoid this right now, use Uvicorn instead of Daphne. Uvicorn passes control to Django in chunks, and depending on the FILE_UPLOAD_MAX_MEMORY_SIZE Django setting you will receive a temporary file on your hard disk rather than in RAM.
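For reference, a sketch of the relevant setting; the value shown is Django's documented 2.5 MB default, so any upload larger than that is spooled to disk by the temporary-file upload handler:

# settings.py (sketch)
# Uploaded files larger than this threshold are streamed to a temporary
# file on disk instead of being held in memory.
FILE_UPLOAD_MAX_MEMORY_SIZE = 2621440  # bytes; Django's 2.5 MB default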
But you still need to write your own AsyncHttpConsumer or AsgiHandler, because the AsgiHandler and AsgiRequest classes from Channels do not support a chunked body either (see the sketch below). This will be possible after the PR.
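For illustration only, here is a minimal sketch of such a consumer. It is not from the original answer: the class name, the temporary-file spooling, and the plain-text response are my own assumptions. It targets Channels' AsyncHttpConsumer and overrides its http_request entry point so that body chunks are written to disk as they arrive instead of accumulating in RAM.

import tempfile

from channels.exceptions import StopConsumer
from channels.generic.http import AsyncHttpConsumer


class ChunkedUploadConsumer(AsyncHttpConsumer):
    """Sketch: spool the request body to a temp file chunk by chunk,
    instead of letting the base class buffer the whole body in RAM."""

    async def http_request(self, message):
        # Each ASGI "http.request" message carries one body chunk;
        # "more_body" stays true until the final chunk arrives.
        if not hasattr(self, "spool"):
            self.spool = tempfile.NamedTemporaryFile(prefix="upload-", delete=False)
        if "body" in message:
            self.spool.write(message["body"])
        if not message.get("more_body"):
            self.spool.close()
            try:
                await self.handle(self.spool.name)
            finally:
                await self.disconnect()
            raise StopConsumer()

    async def handle(self, spooled_path):
        # The complete upload now sits on disk at spooled_path;
        # process it as needed, then respond.
        await self.send_response(
            200,
            b"upload stored on disk",
            headers=[(b"Content-Type", b"text/plain")],
        )

You would route this like any other HTTP consumer, for example through a URLRouter in your ASGI routing configuration.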
Source: https://stackoverflow.com/questions/46986220/daphne-django-file-upload-size-limitations