j...@commandprompt.com ("Joshua D. Drake") writes:
> On the other hand ANALYZE also:
>
> 1. Uses lots of memory
> 2. Lots of processor
> 3. Can take a long time
>
> We normally don't notice because most sets won't incur a penalty. We got a
> customer who has a single table that is over 1TB in size... We notice.
> Granted that is the extreme, but it would only take a quarter of that size
> (which is common) to start seeing issues.
I find it curious that ANALYZE *would* take a long time to run.  After all,
its sampling strategy means that, barring someone having SET STATISTICS to
some ghastly high number, it shouldn't need to do materially more work to
analyze a 1TB table than to analyze a 1GB table.

With the out-of-the-box default of 10 histogram buckets (which may have
changed without my notice ;-)), it should only need to sample on the order
of 3,000 rows (300 per bucket), which, while not "free," doesn't get
enormously more expensive as the table grows.
--
"cbbrowne","@","gmail.com"
http://linuxfinances.info/info/linuxdistributions.html
Rules of the Evil Overlord #179. "I will not outsource core functions."
<http://www.eviloverlord.com/>

--
Sent via pgsql-hackers mailing list (pgsql-hackers@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-hackers
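[Editor's note: the arithmetic behind the sample-size claim above can be
sketched as follows.  This assumes the 300-rows-per-statistics-target-unit
heuristic that PostgreSQL's std_typanalyze() uses; the constant 300 and the
function name come from my reading of the PostgreSQL source, not from this
thread itself.]

```python
# Sketch (not PostgreSQL source): ANALYZE samples roughly
# 300 * statistics_target rows per column, a number that does
# not depend on how large the table is.

def analyze_sample_rows(stats_target: int = 10) -> int:
    """Approximate number of rows ANALYZE samples for one column,
    per the 300 * target heuristic (an assumption here)."""
    return 300 * stats_target

# The sample is the same size whether the table holds a million
# rows (~1GB) or a trillion rows (~1TB):
for table_rows in (10**6, 10**9, 10**12):
    sample = analyze_sample_rows(10)   # old default target of 10
    assert sample == 3000
    assert sample <= table_rows        # always a tiny fraction
```

So with the old default target of 10 the sample stays around 3,000 rows;
even a "ghastly high" SET STATISTICS 1000 only raises it to 300,000, still
a flat cost rather than one that scales with a 1TB table.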