On Tue, Sep 30, 2014 at 8:34 AM, Graeme B. Bell <g...@skogoglandskap.no> wrote:
>
>>> The existing cost estimation
>>> code effectively assumes that they're perfectly uniformly distributed;
>>> which is a good average-case assumption but can be horribly wrong in
>>> the worst case.
>
>
> Sorry, just an outsider jumping in with a quick comment.
>
> Every year or two the core count goes up. Can/should/does postgres ever 
> attempt two strategies in parallel, in cases where strategy A is generally 
> good but strategy B prevents bad worst case behaviour? Kind of like a 
> Schrödinger's Cat approach to scheduling.

> What problems would it raise?

Interleaved I/O: the two plans would compete for the same disk head, which
would kill performance for both if it happens on rotating media.
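
For what it's worth, the "run both, take the first winner" idea being
discussed could be sketched outside PostgreSQL roughly like this (a
hypothetical Python illustration, not anything the planner actually does;
`plan_a`/`plan_b` are stand-ins for two query strategies):

```python
import concurrent.futures
import time

def plan_a():
    # Stand-in for a strategy that is usually fast on average.
    time.sleep(0.01)
    return "plan_a result"

def plan_b():
    # Stand-in for a fallback strategy with a bounded worst case.
    time.sleep(0.05)
    return "plan_b result"

def race(*plans):
    # Run all plans concurrently and return whichever finishes first.
    with concurrent.futures.ThreadPoolExecutor(max_workers=len(plans)) as pool:
        futures = [pool.submit(p) for p in plans]
        done, not_done = concurrent.futures.wait(
            futures, return_when=concurrent.futures.FIRST_COMPLETED)
        for f in not_done:
            # Best effort only: an already-running thread keeps running,
            # so the loser still consumes CPU and (crucially) I/O bandwidth.
            f.cancel()
        return next(iter(done)).result()

print(race(plan_a, plan_b))
```

The comment on `f.cancel()` is exactly the objection raised above: the
losing plan cannot be stopped cheaply, so on rotating media its I/O keeps
interleaving with the winner's until it completes.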


-- 
Sent via pgsql-performance mailing list (pgsql-performance@postgresql.org)
To make changes to your subscription:
http://www.postgresql.org/mailpref/pgsql-performance