Django 1.7 and connection pooling to PostgreSQL?

Submitted by 爷,独闯天下 on 2019-11-30 08:37:29

Postgres database connections are expensive (in resources) compared to MySQL connections. Django pooling apps will open many connections and keep them open.

PgBouncer and PgPool will open fewer connections to Postgres, while maintaining a large number of local connections (app to PgBouncer/PgPool) and reusing them.

For best performance you want both: persistent connections from Django to PgBouncer/PgPool, which in turn pool connections to Postgres.
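In Django 1.6+ persistent connections are controlled by the `CONN_MAX_AGE` database setting. A minimal `settings.py` sketch, assuming a PgBouncer listening locally on its default port 6432 (the database name, user, and password are placeholders):

```python
# settings.py (sketch) -- Django keeps its connection to a local
# PgBouncer open, instead of reconnecting on every request.
# Host, port, and credentials below are placeholders.
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql_psycopg2",
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",   # PgBouncer, not Postgres itself
        "PORT": "6432",        # PgBouncer's default listen port
        "CONN_MAX_AGE": 600,   # reuse each connection for up to 10 minutes
    }
}
```

Note that `CONN_MAX_AGE = 0` (the default) closes the connection after every request, while `CONN_MAX_AGE = None` keeps connections open indefinitely.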

In our case, switching to persistent connections reduced average response time by 10 ms (over 20%) on AWS.

@c2h5oh has a great answer above. I would like to add one thing concerning the Django 1.6 update. I believe what you and the article's author are referring to is the CONN_MAX_AGE setting.

I found this question because I was searching for the same thing myself, so I'm not sure about the following, but allow me to hypothesize:

You should be able to use all three tools together:

  1. CONN_MAX_AGE (django persistent connections)
  2. django-postgrespool (pooled connections to PgBouncer)
  3. PgBouncer (pooled connections to db)
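For #2, django-postgrespool's documented setup is to swap in its database `ENGINE`; a sketch, assuming the package is installed and the same placeholder credentials as above (the pool-tuning values are illustrative, not recommendations):

```python
# settings.py (sketch) -- route Django's connections through
# django-postgrespool's SQLAlchemy-backed connection pool.
DATABASES = {
    "default": {
        "ENGINE": "django_postgrespool",  # replaces the stock postgresql_psycopg2 backend
        "NAME": "mydb",
        "USER": "myuser",
        "PASSWORD": "secret",
        "HOST": "127.0.0.1",
        "PORT": "6432",  # point at PgBouncer to pool at both layers
    }
}

# Optional SQLAlchemy pool tuning (illustrative values)
DATABASE_POOL_ARGS = {
    "max_overflow": 10,  # extra connections allowed beyond pool_size
    "pool_size": 5,      # connections kept open in the pool
    "recycle": 300,      # recycle connections after 5 minutes
}
```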

I know that #2 and #3 play nicely as evidenced by Heroku's article on connection pooling, but I'm not sure about how #1 and #2 interact.

I'm guessing the savings of using #1 and #2 together are pretty slim. Django-postgrespool is essentially designed to save connection time, but each request still has to check a connection out of the pool, so CONN_MAX_AGE would only be saving an aggregate of very small connection times. In addition, if you're using Heroku, CONN_MAX_AGE could possibly interfere with automatic dyno restarts (just a guess).

Note that if you're using a web server like Gunicorn, you may need to make your workers synchronous in order to prevent a connection leak.
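A sketch of such a Gunicorn invocation using the default synchronous worker class, so each worker handles one request (and holds one connection) at a time; the module path and worker count are placeholders:

```shell
# Synchronous workers avoid leaking one connection per greenlet/thread.
# "myproject.wsgi" and the worker count are placeholders.
gunicorn myproject.wsgi:application \
    --workers 4 \
    --worker-class sync
```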
