Analysing/Profiling queries on PostgreSQL

有刺的猬 2020-12-31 12:34

I've just inherited an old PostgreSQL installation and need to do some diagnostics to find out why this database is running slow. On MS SQL you would use a tool such as Pro

2 Answers
  • 2020-12-31 12:47

    My usual approach is a mixture of techniques. None of it requires extensions.

    1. Set log_min_duration_statement to catch long-running queries. https://dba.stackexchange.com/questions/62842/log-min-duration-statement-setting-is-ignored should get you started.
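
    A minimal sketch of that first step; the 500 ms threshold is an illustrative choice, not a recommendation:

    ```sql
    -- Log every statement that runs longer than 500 ms (threshold is illustrative).
    -- Cluster-wide, persisted across restarts:
    ALTER SYSTEM SET log_min_duration_statement = '500ms';
    SELECT pg_reload_conf();

    -- Or just for the current (superuser) session while investigating:
    SET log_min_duration_statement = '500ms';
    ```

    Matching statements then appear in the server log together with their duration, which is usually the quickest way to build a shortlist of suspects.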

    2. Profile the client applications to see which queries they spend their time on. Sometimes a query is individually fast but repeated so often that it causes performance problems.

    EXPLAIN ANALYZE can then help, of course. If you are looking inside PL/pgSQL functions, however, you often need to pull the queries out and run EXPLAIN ANALYZE on them directly.

    Note: ALWAYS run EXPLAIN ANALYZE in a transaction that rolls back, or in a read-only transaction, unless you know the statement does not write to the database — unlike plain EXPLAIN, it actually executes the query.
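
    The rollback pattern from the note above can be sketched like this (the table and statement are hypothetical placeholders):

    ```sql
    BEGIN;

    -- EXPLAIN ANALYZE really runs the statement, so any writes it makes
    -- happen inside this transaction:
    EXPLAIN ANALYZE
    UPDATE orders SET status = 'shipped' WHERE id = 42;

    -- The plan and timings have already been printed; undo the write:
    ROLLBACK;
    ```

    You still get the full plan with actual row counts and timings, but the data is left untouched.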

  • 2020-12-31 12:59

    Use the pg_stat_statements extension to find long-running queries. Then use select * from pg_stat_statements order by total_time/calls desc limit 10 to get the ten slowest on average, and run EXPLAIN on each to see its plan.
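
    A sketch of that sequence; note that the extension must be preloaded, and that on PostgreSQL 13 and later the column is named total_exec_time rather than total_time:

    ```sql
    -- One-time setup: requires shared_preload_libraries = 'pg_stat_statements'
    -- in postgresql.conf and a server restart, then:
    CREATE EXTENSION IF NOT EXISTS pg_stat_statements;

    -- Ten statements with the highest mean execution time
    -- (substitute total_exec_time for total_time on PostgreSQL 13+):
    SELECT query, calls, total_time / calls AS mean_time_ms
    FROM pg_stat_statements
    ORDER BY total_time / calls DESC
    LIMIT 10;
    ```

    Ordering by total_time instead of total_time/calls answers a different question: which queries consume the most time overall, including the cheap-but-frequent ones the first answer mentions.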
