On Monday 28 February 2011 13:57:45, Heikki Linnakangas wrote:
> On 28.02.2011 11:38, Marc Cousin wrote:
> > I've been facing a very large (more than 15 seconds) planning time in a
> > partitioned configuration. The amount of partitions wasn't completely
> > crazy, around 500, not in the thousands. The problem was that there were
> > nearly 1000 columns in the parent table (very special use case, there is
> > a reason for this application having that many columns). The check
> > constraint was extremely simple (for each child, 1 column = 1 constant,
> > always the same column).
> >
> > As I was surprised by this very large planning time, I have been trying
> > to study the variation of planning time against several parameters:
> > - number of columns
> > - number of children tables
> > - constraint exclusion's value (partition or off)
> >
> > What (I think) I measured is that the planning time seems to be O(n^2)
> > for the number of columns, and O(n^2) for the number of children tables.
> >
> > Constraint exclusion had a limited impact on planning time (it added
> > between 20% and 100% planning time when there were many columns).
>
> Testing here with a table with 1000 columns and 100 partitions, about
> 80% of the planning time is looking up the statistics on attribute
> width, to calculate average tuple width. I don't see O(n^2) behavior,
> though, it seems linear.
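For reference, a setup like the one described above can be reproduced by generating the DDL programmatically. This is only a sketch of the inheritance-based partitioning scheme the thread is discussing; the table and column names (parent_tbl, child_N, part_key, col_N) are hypothetical, not taken from the original test case.

```python
# Sketch: generate DDL for a parent table with many columns and children
# carrying a trivial CHECK constraint (1 column = 1 constant), matching
# the shape of the setup described in the thread. Names are made up.
def make_ddl(n_cols=1000, n_children=500):
    cols = ", ".join(f"col_{i} integer" for i in range(n_cols))
    stmts = [f"CREATE TABLE parent_tbl (part_key integer, {cols});"]
    for c in range(n_children):
        stmts.append(
            f"CREATE TABLE child_{c} "
            f"(CHECK (part_key = {c})) INHERITS (parent_tbl);"
        )
    return "\n".join(stmts)

if __name__ == "__main__":
    # Small instance just to show the generated statements.
    print(make_ddl(n_cols=5, n_children=3))
```

Feeding the full-size output to psql and timing EXPLAIN on a query filtered by part_key (with constraint_exclusion set to partition or off) is then enough to compare planning times across parameter values.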
It is only based on experimentation, for my part, of course… If you measure the planning time while varying either the number of columns or the number of partitions, the square root of the planning time is almost perfectly proportional to the parameter you're playing with.
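The reasoning behind that check can be sketched as follows: if planning time t grows as O(n^2) in a parameter n, then sqrt(t) should be a straight line in n, so the correlation between n and sqrt(t) should be close to 1. The timings below are fabricated illustrative values, not the actual measurements from this thread.

```python
import math

def sqrt_linearity(params, timings):
    """Pearson correlation between a parameter and sqrt(planning time).

    A value near 1.0 means sqrt(t) is almost perfectly linear in the
    parameter, i.e. t itself is close to quadratic.
    """
    roots = [math.sqrt(t) for t in timings]
    n = len(params)
    mean_x = sum(params) / n
    mean_y = sum(roots) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(params, roots))
    var_x = sum((x - mean_x) ** 2 for x in params)
    var_y = sum((y - mean_y) ** 2 for y in roots)
    return cov / math.sqrt(var_x * var_y)

# Made-up example: planning times roughly quadratic in the partition count.
partitions = [100, 200, 300, 400, 500]
timings_s = [0.6, 2.4, 5.5, 9.7, 15.2]  # seconds, illustrative only
r = sqrt_linearity(partitions, timings_s)
print(f"correlation of sqrt(time) vs parameter: {r:.4f}")
```

With real measurements in place of the fabricated ones, a correlation very close to 1 for both parameters would support the O(n^2)-in-each-parameter observation.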