David Gould <da...@sonic.net> writes:
> I'm confused at this point; I provided a patch that addresses this and a
> test case. We seem to be discussing everything as if we first noticed the
> issue. Have you reviewed the patch and the attached analysis, and tested
> it? Please comment on that.

I've looked at the patch.  The questions in my mind are

(1) do we really want to go over to treating ANALYZE's tuple density
result as gospel, contradicting the entire thrust of the 2011 discussion?

(2) what should we do in the VACUUM case?  Alexander's argument seems
to apply with just as much force to the VACUUM case, so either you
discount that or you conclude that VACUUM needs adjustment too.

> This table's reltuples is 18 times the actual row count. It will never
> converge, because with 50000953 pages analyze can only move reltuples
> about 0.0006 of the way toward the sampled density each time.

But by the same token, analyze only looked at 0.0006 of the pages.  It's
nice that for you, that's enough to get a robust estimate of the density
everywhere; but I have a nasty feeling that that won't hold good for
everybody.  The whole motivation of the 2011 discussion, and the issue
that is also seen in some other nearby discussions, is that the density
can vary wildly.
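
For concreteness, here is the arithmetic in question.  The sketch below
approximates the moving-average update in vac_estimate_reltuples()
(src/backend/commands/vacuum.c); the real function has a different
signature, and the table numbers are invented to match the shape of
David's report (50000953 pages, reltuples 18x the actual count):

    #include <math.h>
    #include <stdio.h>

    /*
     * Simplified sketch of the incremental reltuples update under
     * discussion; illustration only, not the actual server code.
     */
    static double
    estimate_reltuples(double total_pages, double old_reltuples,
                       double scanned_pages, double scanned_tuples)
    {
        double  old_density = old_reltuples / total_pages;
        double  new_density = scanned_tuples / scanned_pages;
        /* weight the sampled density by the fraction of pages scanned */
        double  multiplier = scanned_pages / total_pages;
        double  density = old_density +
            (new_density - old_density) * multiplier;

        return floor(density * total_pages + 0.5);
    }

    int
    main(void)
    {
        double  total_pages = 50000953; /* from David's report */
        double  sample_pages = 30000;   /* 300 * statistics target 100 */
        double  true_density = 20.0;    /* invented: ~1e9 actual rows */
        double  reltuples = 18e9;       /* the 18x overestimate */

        for (int i = 1; i <= 3; i++)
        {
            reltuples = estimate_reltuples(total_pages, reltuples,
                                           sample_pages,
                                           sample_pages * true_density);
            printf("after analyze #%d: reltuples = %.0f\n", i, reltuples);
        }
        return 0;
    }

Since 30000 / 50000953 ~= 0.0006, each pass moves the estimate only about
0.06% of the way toward the sampled density, which is David's complaint;
but the flip side is that accepting the sample outright stakes the whole
estimate on that same 0.0006 of the pages.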

                        regards, tom lane
