Jim C. Nasby wrote:
On Wed, Dec 21, 2005 at 05:43:38PM -0500, Bruce Momjian wrote:

Rick Gigger wrote:

It seems to me like there are two classes of problems here:

1) Simply invalidating plans made with out-of-date statistics.
2) Using run-time collected data to update the plan to something more intelligent.

It also seems like #1 would be fairly straightforward and simple, whereas #2 would be much more complex. #1 would do me a world of good, and probably other people as well. Postgres's query planning has always been fine for me, or at least I have always been able to optimize my queries once I had a representative data set to work with. Query plan caching only bites me when the plan is created before the statistics needed to produce a good plan are present.
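
To make that concrete, here is a minimal sketch of the situation I mean (Python with psycopg2; the "events" table, the data, and the connection string are made up for illustration, and it assumes a server that plans once at PREPARE time, as PostgreSQL did back then; later releases added automatic plan invalidation for exactly this case, so on a current server the two plans may well come out the same):

    import psycopg2

    conn = psycopg2.connect("dbname=test")   # hypothetical connection details
    conn.autocommit = True
    cur = conn.cursor()

    cur.execute("DROP TABLE IF EXISTS events")
    cur.execute("CREATE TABLE events (id serial PRIMARY KEY, kind integer)")

    # The statement is planned here, while the table is empty and unanalyzed,
    # so the planner has no useful statistics to work with.
    cur.execute("PREPARE count_kind (integer) AS "
                "SELECT count(*) FROM events WHERE kind = $1")

    # Now the real data (and an index) arrive, and the statistics change completely.
    cur.execute("INSERT INTO events (kind) "
                "SELECT i % 100 FROM generate_series(1, 1000000) AS i")
    cur.execute("CREATE INDEX events_kind_idx ON events (kind)")
    cur.execute("ANALYZE events")

    # Compare the plan cached at PREPARE time with a freshly planned query.
    cur.execute("EXPLAIN EXECUTE count_kind(42)")
    print("cached plan:\n" + "\n".join(row[0] for row in cur.fetchall()))

    cur.execute("EXPLAIN SELECT count(*) FROM events WHERE kind = 42")
    print("fresh plan:\n" + "\n".join(row[0] for row in cur.fetchall()))
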

Just one user's 2 cents.

Agreed.  I just can't add #2 unless we get more agreement from the
group, because it has been a disputed issue in the past.


Well, how about this, since it's a prerequisite for #2 and would be
generally useful anyway:

Track normal resource consumption (i.e. tuples read) for planned queries
and record parameter values that result in drastically different
resource consumption.

This would at least make it easy for admins to identify prepared queries
that have a highly variable execution cost.
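
As a rough illustration only (not the actual implementation, which would track executor-level counters such as tuples read inside the server), here is a client-side sketch in Python against the hypothetical "events" table from the earlier example; it runs the same parameterised query over a range of values and flags any whose wall-clock cost is far outside the norm. The query, connection string, and 3-sigma threshold are all made up:

    import statistics
    import time

    import psycopg2

    QUERY = "SELECT count(*) FROM events WHERE kind = %s"

    def timed_run(cur, param):
        """Return the elapsed wall-clock time for one execution, in milliseconds."""
        start = time.perf_counter()
        cur.execute(QUERY, (param,))
        cur.fetchall()
        return (time.perf_counter() - start) * 1000.0

    conn = psycopg2.connect("dbname=test")   # hypothetical connection details
    cur = conn.cursor()

    # Sample the cost of each parameter value we care about.
    samples = {p: timed_run(cur, p) for p in range(100)}
    mean = statistics.mean(samples.values())
    stdev = statistics.stdev(samples.values())

    # "Drastically different resource consumption": anything more than three
    # standard deviations from the mean gets logged for the DBA to look at.
    for param, ms in sorted(samples.items()):
        if stdev and abs(ms - mean) > 3 * stdev:
            print(f"outlier: kind = {param} took {ms:.1f} ms (mean {mean:.1f} ms)")

With uniformly distributed test data nothing will be flagged, but against skewed real data the values that force a very different amount of work stand out immediately, which is the signal the server-side version would log.
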

Yeah, such a log would be very helpful in its own right for DBAs, and also as a feedback loop for finding possible issues in the query planner. And maybe one day that feedback loop could even be used directly by the server itself.

regards,
Lukas Smith
