Running Django worker and Daphne in a Docker container


Question


I have a Django application that runs in a Docker container. Recently I figured out that I'm going to need to add a WebSocket interface to my application. I'm using Channels with Daphne behind nginx, and Redis as a cache. The problem is that I have to run the Django workers and Daphne in one container. This is the script that runs on container startup:

#!/usr/bin/env bash

python wait_for_postgres.py
python manage.py makemigrations
python manage.py migrate
python manage.py collectstatic --no-input

python manage.py runworker --only-channels=http.* --only-channels=websocket.* -v2
daphne team_up.asgi:channel_layer --port 8000 -b 0.0.0.0

But it hangs on running the worker. I tried nohup, but it doesn't seem to work. If I run Daphne directly in the container with docker exec, everything works just fine.


Answer 1:


This is an old question, but I figured I would answer it anyway, because I recently faced the same issue and thought I could shed some light on it.

How Django channels work

Django Channels is another layer on top of Django and it has two process types:

  • One that accepts HTTP requests and WebSocket connections
  • One that runs Django views, WebSocket handlers, background tasks, etc.

Basically, when a request comes in, it first hits the interface server (Daphne), which accepts the HTTP/WebSocket connection and puts it on the Redis queue. The worker (consumer) then sees it, takes it off the queue, and runs the view logic (e.g. Django views, WebSocket handlers, etc.).
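Concretely, the two process types map onto the two commands from the startup script in the question:

# Interface server (producer): accepts HTTP/WebSocket connections and puts
# incoming messages on the Redis channel layer
daphne team_up.asgi:channel_layer --port 8000 -b 0.0.0.0

# Worker (consumer): takes messages off the channel layer and runs the
# view/consumer logic
python manage.py runworker -v2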

Why it didn't work for you

Because you run the worker (consumer) first, and it blocks the execution of the interface server (producer). That means no connections are ever accepted, and the worker just stares at an empty Redis queue.
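In other words, runworker is a blocking, long-running foreground process, just like daphne. If you really had to keep both in one container, the worker would have to be backgrounded so the script ever reaches the daphne line; a workaround sketch (not the layout I recommend below):

# Workaround only: background the worker so daphne can start as well
python manage.py runworker --only-channels=http.* --only-channels=websocket.* -v2 &
daphne team_up.asgi:channel_layer --port 8000 -b 0.0.0.0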

How I made it work

I run Daphne, Redis and the workers as separate containers for easy scaling. DB migrations, static file collection, etc. are executed only in the Daphne container. That container has only one running instance, to ensure that no parallel DB migrations are run.

The workers, on the other hand, can be scaled up and down to deal with incoming traffic.
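A hypothetical sketch of that layout with plain docker run commands (the network name, image name and script paths below are illustrative assumptions, not taken from my actual setup):

# All containers share one network, so the app can reach Redis by name
docker network create team_up_net

docker run -d --name redis --network team_up_net redis

# Single Daphne container: runs migrations/collectstatic, then the interface server
docker run -d --name daphne --network team_up_net -p 8000:8000 team_up ./start_daphne.sh

# Worker containers: as many as the traffic requires
docker run -d --name worker_1 --network team_up_net team_up ./start_worker.sh
docker run -d --name worker_2 --network team_up_net team_up ./start_worker.sh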

How you could make it work

Split your setup into at least two containers. I wouldn't recommend running everything in one container (using Supervisor, for example). Why? Because when the time comes to scale the setup, there's no easy way to do it. You could scale your container to two instances, but that just creates another Supervisor with Daphne, Redis and Django in it. If you split the worker from Daphne, you can easily scale the worker container to deal with growing incoming requests.

One container could run:

#!/usr/bin/env bash

python wait_for_postgres.py
python manage.py migrate
python manage.py collectstatic --no-input

daphne team_up.asgi:channel_layer --port 8000 -b 0.0.0.0

while the other one:

#!/usr/bin/env bash

python wait_for_postgres.py
python manage.py runworker --only-channels=http.* --only-channels=websocket.* -v2
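With the worker split out like this, scaling becomes trivial. For example, assuming an equivalent docker-compose.yml that defines daphne, redis and worker services (hypothetical service names), four workers could be started with:

# Start the stack with four worker containers
docker-compose up -d --scale worker=4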

The 'makemigrations' command

There is no need to run that command in the script you provided; if anything, it could block the whole thing while awaiting input on some question (e.g. "Did you rename column X to Y?").

Instead, you can execute it in a running container like this:

docker exec -it <container_name> python manage.py makemigrations


Source: https://stackoverflow.com/questions/37905539/running-django-worker-and-daphne-in-docker-container
