I am experiencing a problem with a Django application that exceeds the maximum number of simultaneous connections (100) to Postgres when running through Gunicorn with async eventlet workers. When the connection limit is reached, the application starts returning 500 errors until new connections can be established.
This is my database configuration:
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'django',
        'USER': 'django',
        'HOST': 'postgres',
        'PORT': 5432,
        'CONN_MAX_AGE': 60,
    }
}
This is how Gunicorn is started:
gunicorn --bind 0.0.0.0:8080 --worker-class eventlet --workers 5 myapp.wsgi:application
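The arithmetic behind the exhaustion can be sketched as follows (assuming Gunicorn's default worker_connections of 1000 for async workers, and that Django opens one database connection per greenlet handling a concurrent request):

```python
# Rough capacity sketch: with eventlet workers, each concurrent request
# runs in its own greenlet, and Django opens a separate DB connection
# per greenlet, so the theoretical ceiling far exceeds Postgres' limit.
workers = 5                 # --workers 5
worker_connections = 1000   # Gunicorn default for async worker classes
postgres_max = 100          # Postgres default max_connections

potential_db_connections = workers * worker_connections
print(potential_db_connections)                  # 5000
print(potential_db_connections > postgres_max)   # True
```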
These are the installed packages:
- django v1.7
- gunicorn v19.3
- eventlet v0.17
- psycopg2 v2.6
Is Django not able to reuse connections across HTTP requests when running with Gunicorn workers? Is some kind of 3rd party database connection pool my only option here?
Update 15-03-23: There appears to be a problem with CONN_MAX_AGE and async Gunicorn workers. Connections are indeed persistent, but they are never reused on subsequent requests, as noted in this post. Setting CONN_MAX_AGE to 0 forces Django to close the connection when the request ends, preventing unused persistent connections from lying around.
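For reference, a minimal sketch of the changed DATABASES entry (identical to the config above except for CONN_MAX_AGE):

```python
# CONN_MAX_AGE = 0 (the Django default) closes the connection at the
# end of every request instead of keeping it open for 60 seconds.
DATABASES = {
    'default': {
        'ENGINE': 'django.contrib.gis.db.backends.postgis',
        'NAME': 'django',
        'USER': 'django',
        'HOST': 'postgres',
        'PORT': 5432,
        'CONN_MAX_AGE': 0,  # was 60; close connections when the request ends
    }
}
```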
Django does no database connection pooling. Have a look at PgBouncer. It is a lightweight connection pooler that is easy to set up and configure: https://wiki.postgresql.org/wiki/PgBouncer
In short: your Django app connects to PgBouncer, which maintains a pool of connections to Postgres that it reuses, so the maximum connection limit is never exceeded.
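A minimal sketch of what that setup could look like (the database name and credentials mirror the config in the question; the port, pool size, and pool mode are assumptions, not prescriptions):

```ini
; pgbouncer.ini (sketch): Django connects to PgBouncer on port 6432;
; PgBouncer keeps at most 20 server connections open to Postgres.
[databases]
django = host=postgres port=5432 dbname=django

[pgbouncer]
listen_addr = 0.0.0.0
listen_port = 6432
auth_type = md5
auth_file = /etc/pgbouncer/userlist.txt
pool_mode = transaction
default_pool_size = 20
```

You would then point Django's HOST/PORT at PgBouncer (e.g. pgbouncer:6432) instead of Postgres directly. Note that with transaction pooling you should keep CONN_MAX_AGE at 0, since server connections are not tied to a client session across requests.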
Source: https://stackoverflow.com/questions/29170542/django-exceeds-maximum-postgres-connections