Joshua D. Drake wrote:
> We normally don't notice because most data sets won't incur a penalty. We have a customer with a single table that is over 1TB in size... We notice. Granted, that is the extreme, but it would only take a quarter of that size (which is common) to start seeing issues.
Right, and the only thing that makes this case less painful is that you don't really need the stats to be updated quite as often in situations with that much data. If, say, your stats say there are 2B rows in the table but there are actually 2.5B, that's a big error, but it's unlikely to change the types of plans you get. Once there are millions of distinct values, it takes a big change for plans to shift, etc.
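To make that concrete, here's a rough sketch of how you can compare the row count the planner thinks it has against reality, and where that estimate surfaces; the table name is made up, substitute your own:

    -- 'big_table' is a hypothetical name used for illustration.
    -- Planner's stored estimate, set by the last ANALYZE or VACUUM:
    SELECT reltuples::bigint AS estimated_rows
    FROM pg_class
    WHERE relname = 'big_table';

    -- Actual row count (this is a full scan, so it's expensive on a 1TB table):
    SELECT count(*) AS actual_rows FROM big_table;

    -- The same estimate feeds the "rows=" figure in plan output:
    EXPLAIN SELECT * FROM big_table;

Even if estimated_rows and actual_rows differ by hundreds of millions, the relative error is small, so the plans EXPLAIN shows tend to stay the same shape.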
--
Greg Smith  2ndQuadrant  Baltimore, MD
PostgreSQL Training, Services and Support
g...@2ndquadrant.com  www.2ndQuadrant.com