On Thu, Jul 9, 2009 at 5:38 AM, Noah Misch <n...@leadboat.com> wrote:
> Describing in those terms illuminates much.  While the concepts do suggest 2^N
> worst-case planning cost, my artificial test case showed a rigid 4^N pattern;
> what could explain that?
>

Isn't that just because the planner has to examine O(2^N) subsets of
relations, and can do up to O(2^N) work for each of them? To create a
level-N join the planner pairs level-k and level-(N-k) joins. The
count of level-k joins is C(N,k) and the count of level-(N-k) ones is
C(N,N-k), so one level alone costs sum over k of C(N,k) * C(N,N-k) =
C(2N,N), which grows like 4^N / sqrt(N). Summed over all levels that
is still Theta(4^N), which would explain the rigid 4^N pattern.
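
As a sanity check, here is a quick throwaway Python script (my own
sketch, not planner code) that counts the joinrel pairs a
level-by-level dynamic-programming search would at least look at,
assuming the worst case where every subset of the N base relations is
a valid join:

    from math import comb

    def pairs_examined(n):
        # Worst case: every k-relation subset is a valid level-k join,
        # so there are C(n, k) joinrels at level k.  Building level L
        # means looking at every (level-k, level-(L-k)) pair of them.
        total = 0
        for level in range(2, n + 1):
            for k in range(1, level):
                total += comb(n, k) * comb(n, level - k)
        return total

    for n in range(4, 13):
        print(n, pairs_examined(n), pairs_examined(n) / 4 ** n)

The last column stays in the 0.5-0.6 range as N grows, i.e. the pair
count really does track 4^N rather than 2^N.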


This is the worst case. If we could make a better estimate of the
required planning time (I believe the input data for a good heuristic
is a matrix saying which relation is constrained to which relation),
we could make better decisions about when to flatten subqueries,
collapse joins, launch GEQO...
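
To make the idea concrete, here is a rough sketch of such an estimator
(Python again; the function names and the matrix encoding are my own
assumptions, this is not planner code): count the connected subsets of
relations in the constraint matrix, since only connected subsets can
become joinrels in the dynamic-programming search. A chain of joins
stays quadratic while a clique explodes to 2^N - 1:

    from itertools import combinations

    def connected_subsets(adj):
        # adj[i][j] is True when relation i has a join clause to j.
        n = len(adj)
        count = 0
        for size in range(1, n + 1):
            for subset in combinations(range(n), size):
                members = set(subset)
                # flood-fill inside the subset to test connectivity
                seen = {subset[0]}
                stack = [subset[0]]
                while stack:
                    r = stack.pop()
                    for s in members:
                        if s not in seen and adj[r][s]:
                            seen.add(s)
                            stack.append(s)
                if seen == members:
                    count += 1
        return count

    def chain(n):   # r1 - r2 - ... - rn
        return [[abs(i - j) == 1 for j in range(n)] for i in range(n)]

    def clique(n):  # every relation constrained to every other
        return [[i != j for j in range(n)] for i in range(n)]

    for n in (4, 8, 12):
        print(n, connected_subsets(chain(n)), connected_subsets(clique(n)))

A planner that computed (or merely bounded) this number could fall
back to GEQO based on the actual shape of the join graph, instead of
only the raw relation count the way geqo_threshold does now.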

Greetings
Marcin
