PostgreSQL Index Usage Analysis

伪装坚强ぢ 2021-01-29 17:43

Is there a tool or method to analyze a Postgres database and determine which missing indexes should be created, and which unused indexes should be removed? I have a little experience doing

9 Answers
  •  醉梦人生
    2021-01-29 18:01

    There are multiple links to scripts that will help you find unused indexes on the PostgreSQL wiki. The basic technique is to look at pg_stat_user_indexes for indexes where idx_scan, the count of how many times that index has been used to answer queries, is zero or at least very low. If the application has changed and an index that used to be needed probably isn't anymore, you sometimes have to run pg_stat_reset() to set all the statistics back to zero and then collect new data; alternatively, you can save the current values for everything and compute a delta later to figure that out.
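    A minimal sketch of that check, assuming you only care about indexes with an idx_scan of zero and want to skip unique indexes (they enforce constraints even if they are never scanned):

        -- List indexes that have never been used to answer a query, largest first.
        -- Unique indexes are excluded because dropping them removes a constraint,
        -- not just an access path.
        SELECT s.schemaname,
               s.relname       AS table_name,
               s.indexrelname  AS index_name,
               s.idx_scan,
               pg_size_pretty(pg_relation_size(s.indexrelid)) AS index_size
        FROM pg_stat_user_indexes s
        JOIN pg_index i USING (indexrelid)
        WHERE s.idx_scan = 0
          AND NOT i.indisunique
        ORDER BY pg_relation_size(s.indexrelid) DESC;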

    There aren't any good tools available yet to suggest missing indexes. One approach is to log the queries you're running and analyze which ones take a long time using a query log analysis tool like pgFouine or pqa. See "Logging Difficult Queries" for more info.
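    If you go the logging route, a sketch of the relevant postgresql.conf settings follows; the 250 ms threshold is just an example value, and log analyzers usually want a particular log_line_prefix format, so check the tool's documentation:

        # Log every statement that runs longer than 250 ms
        log_min_duration_statement = 250
        # Include a timestamp and process ID so the analyzer can group log lines
        log_line_prefix = '%t [%p]: '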

    The other approach is to look at pg_stat_user_tables for tables that have large numbers of sequential scans against them, where seq_tup_read is large. When an index is used, the idx_tup_fetch count is incremented instead. That can clue you in to when a table is not indexed well enough to answer queries against it.
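    A rough query along those lines, assuming you just want the tables reading the most rows via sequential scans:

        -- Tables ordered by rows returned from sequential scans; a large
        -- seq_tup_read next to a small idx_tup_fetch hints at a missing index.
        SELECT relname,
               seq_scan,
               seq_tup_read,
               idx_scan,
               idx_tup_fetch
        FROM pg_stat_user_tables
        ORDER BY seq_tup_read DESC
        LIMIT 20;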

    Actually figuring out which columns you should then index on? That usually leads back to the query log analysis stuff again.
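    Once the log analysis points at a specific slow query, running EXPLAIN ANALYZE on it shows whether the planner is falling back to a sequential scan; the table and column below are made-up placeholders:

        -- A Seq Scan node with a selective WHERE clause on a large table
        -- usually marks the column worth indexing.
        EXPLAIN ANALYZE
        SELECT * FROM orders WHERE customer_id = 42;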
