Question
I am using django-channels to add HTTP/2 and WebSocket support to my application. I could not find much documentation on how to scale Channels. Below is my nginx configuration, which load-balances multiple instances of daphne running on the same machine on different ports. Is this the correct way to do it?
upstream socket {
    least_conn;
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
    server 127.0.0.1:9003;
}

server {
    listen 80;
    server_name 127.0.0.1;

    location = /favicon.ico { access_log off; log_not_found off; }

    location /static/ {
        root /home/niscp/home-screen;
    }

    location /nicons/ {
        root /home/niscp/home-screen;
    }

    location / {
        include uwsgi_params;
        uwsgi_pass unix:/home/niscp/home-screen/home-screen.sock;
    }

    location /ws/ {
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_pass http://socket;
    }
}
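Each server entry in the upstream block is a separate daphne process bound to that port. They are started roughly like this (a sketch; home_screen.asgi is only a placeholder for the actual Channels 1.x ASGI module name):
# illustrative only: the ASGI module name home_screen.asgi is a placeholder
daphne home_screen.asgi:channel_layer --bind 127.0.0.1 --port 9000
daphne home_screen.asgi:channel_layer --bind 127.0.0.1 --port 9001
daphne home_screen.asgi:channel_layer --bind 127.0.0.1 --port 9002
daphne home_screen.asgi:channel_layer --bind 127.0.0.1 --port 9003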
Along with that, I am running individual instances of workers to listen to individual channels in the following manner:
python manage.py runworker --only-channels=websocket.connect
python manage.py runworker --only-channels=websocket.receive
python manage.py runworker --only-channels=websocket.disconnect
I have got uwsgi to handle all HTTP requests the way django normally handles them. All that daphne and the workers do is handle WebSocket requests.
Is this a viable method to scale django-channels, or is there something I could do better?
Answer 1:
There are a few things here. To start, I don't think you're going to see much gain from running different types of requests in different processes. Your disconnect handlers are probably going to be very light, not doing much besides cleanup. Connect might not do much either, and receive will get most of the load.
You're better off using the --threads parameter and starting multiple threads. Your current setup would only run one thread for each type of handler.
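For example, one worker process covering all channels with several threads could be started like this (the thread count here is arbitrary):
# sketch: one worker process, several threads, no channel restriction
python manage.py runworker --threads 4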
The way runworker works is that it communicates with Daphne over your channel layer (e.g. Redis). All of the workers listen on a queue. When a request comes in, one worker picks it up and processes it. While that worker is busy, the other workers wait for subsequent requests and process them. Once a worker sends its response, it goes back to listening on the queue. If no --only-channels is specified, each process will pull requests and work on them as fast as it can, and none of them will be waiting around.
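For context, the channel layer that Daphne and the workers share is configured in the Django settings. A minimal Channels 1.x sketch for a Redis-backed layer (the host and the routing module path are assumptions):
# settings.py (sketch) - the channel layer shared by Daphne and every runworker process
CHANNEL_LAYERS = {
    "default": {
        "BACKEND": "asgi_redis.RedisChannelLayer",
        "CONFIG": {
            "hosts": [("127.0.0.1", 6379)],  # assumed local Redis instance
        },
        "ROUTING": "home_screen.routing.channel_routing",  # illustrative module path
    },
}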
It's up to you to find the best balance of threads and workers by running multiple processes and using the --threads parameter. You can also reserve workers for heavy channels so they don't bring down your site.
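As an illustration, reserving capacity for a heavy channel while other workers handle everything else might look like this (the channel name and thread counts are only examples):
# sketch: dedicate one worker to the heavy channel, keep another for the rest
python manage.py runworker --only-channels=websocket.receive --threads 2
python manage.py runworker --exclude-channels=websocket.receive --threads 2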
Having multiple Daphne instances will help, but since all they do is pass messages between your server and the workers, you might not see much benefit from running four of them.
None of this applies to Channels 2; it is for the old version of Django Channels.
Source: https://stackoverflow.com/questions/46764574/running-multiple-instances-of-daphne-behind-a-load-balancer-django-channels