I'd like to see which queries are being executed on a live Django application, and how much memory they are taking up. I have read that pg_stat_activity can be used for this.
See this closely related answer.
`pg_stat_activity` is a view in the `pg_catalog` schema. You can query it by `SELECT`ing from it like any other table, e.g. `SELECT * FROM pg_stat_activity`. The manual page you linked to explains its columns.
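For example, a minimal query against it might look like the sketch below (assuming PostgreSQL 9.2 or newer, where the columns are named `pid`, `state` and `query`; older releases use `procpid` and `current_query`):

```sql
-- Show who is connected and what each backend is currently doing.
-- Column names assume PostgreSQL 9.2+; older versions use procpid/current_query.
SELECT pid,
       usename,
       datname,
       state,
       query_start,
       query
FROM pg_stat_activity
ORDER BY query_start;
```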
You'll sometimes find yourself wanting to join on other tables like `pg_class` (tables), `pg_namespace` (schemas), etc.
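As a sketch of that kind of join (not from the original answer, and again assuming 9.2+ column names), here is one way to see which tables each session currently holds locks on, resolving relation OIDs through `pg_class` and `pg_namespace`:

```sql
-- Which sessions hold locks on which relations, with schema-qualified names.
SELECT a.pid,
       a.usename,
       n.nspname AS schema_name,
       c.relname AS table_name,
       l.mode,
       l.granted,
       a.query
FROM pg_locks l
JOIN pg_stat_activity a ON a.pid = l.pid
JOIN pg_class c        ON c.oid = l.relation
JOIN pg_namespace n    ON n.oid = c.relnamespace
WHERE l.relation IS NOT NULL
ORDER BY a.pid;
```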
`pg_stat_activity` does not expose information about back-end memory use. You need to use operating-system level facilities for that. It does tell you the process ID, active user, currently running query, activity status, time the last query started, etc. It's good for identifying long-running `idle in transaction` sessions, very long running queries, etc.
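A rough example of that kind of check (the five-minute threshold is arbitrary and just for illustration):

```sql
-- Sessions that are idle in a transaction, or have been running a query
-- for more than five minutes.
SELECT pid,
       usename,
       state,
       now() - xact_start  AS xact_age,
       now() - query_start AS query_age,
       query
FROM pg_stat_activity
WHERE state = 'idle in transaction'
   OR now() - query_start > interval '5 minutes'
ORDER BY query_start;
```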
Frankly, PostgreSQL's built-in monitoring is rather rudimentary. It's one of the areas that's not that exciting to work on, and commercial clients aren't often willing to fund it. Most people couple tools like check_postgres with Icinga and Munin, or use Zabbix or other external monitoring agents.
In your case it sounds like you really want `pg_stat_statements`, and/or PgBadger log analysis with suitable logging settings, and possibly the `auto_explain` module.
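Roughly, setting up `pg_stat_statements` means adding it to `shared_preload_libraries` in `postgresql.conf` and restarting the server, then creating the extension and querying its view; `auto_explain` is loaded the same way, with `auto_explain.log_min_duration` controlling which plans get logged. A sketch of the SQL side (column names shift between versions; on PostgreSQL 13 and later `total_time` becomes `total_exec_time`):

```sql
-- Once pg_stat_statements is listed in shared_preload_libraries and the
-- server has been restarted:
CREATE EXTENSION IF NOT EXISTS pg_stat_statements;

-- Top statements by cumulative execution time.
-- (Use total_exec_time instead of total_time on PostgreSQL 13+.)
SELECT calls,
       total_time,
       query
FROM pg_stat_statements
ORDER BY total_time DESC
LIMIT 10;
```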