What is the fastest way to rebuild PostgreSQL statistics from zero/scratch with ANALYZE?


Question


I have a PostgreSQL v10 database with a size of about 100GB.

What is the most efficient (fastest) way to rebuild statistics, for example after a major version upgrade?

ANALYZE with no parameters updates statistics for the entire database by default, and it's painfully slow! It appears to run as a single process.

Is there any way to parallelize this to speed it up?


Answer 1:


You could use vacuumdb with the same options that pg_upgrade suggests:

vacuumdb --all --analyze-in-stages

The documentation describes what it does:

Only calculate statistics for use by the optimizer (no vacuum), like --analyze-only. Run several (currently three) stages of analyze with different configuration settings, to produce usable statistics faster.

This option is useful to analyze a database that was newly populated from a restored dump or by pg_upgrade. This option will try to create some statistics as fast as possible, to make the database usable, and then produce full statistics in the subsequent stages.
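Conceptually, each stage runs ANALYZE with an increasing statistics target, so the planner gets rough statistics almost immediately and full statistics at the end. A minimal per-database sketch of the idea, assuming a hypothetical database named mydb (vacuumdb drives this per table, and the exact settings may vary by version):

# stage 1: minimal statistics (fastest, makes the planner usable)
psql -d mydb -c "SET default_statistics_target = 1; ANALYZE;"
# stage 2: coarse statistics
psql -d mydb -c "SET default_statistics_target = 10; ANALYZE;"
# stage 3: full statistics at the configured target
psql -d mydb -c "RESET default_statistics_target; ANALYZE;"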

To calculate statistics with several parallel processes, you can use the -j (--jobs) option of vacuumdb.
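For example, to combine the staged analyze with eight parallel jobs (eight is just an illustration; pick a job count that suits your CPU cores):

vacuumdb --all --analyze-in-stages --jobs=8

Each stage still finishes before the next one starts, but the tables within a stage are analyzed concurrently across the parallel connections.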



Source: https://stackoverflow.com/questions/52958570/what-is-the-fastest-way-to-rebuild-postgresql-statistics-from-zero-scratch-with
