Hi,

On 2019-11-01 12:22:06 -0400, Robert Haas wrote:
> On Fri, Nov 1, 2019 at 12:00 PM Andres Freund <and...@anarazel.de> wrote:
> > That seems like a bad idea - we add the cost multiple times. And we
> > still want to compare plans that potentially involve that cost, if
> > there's no other way to plan the query.
>
> Yeah. I kind of wonder if we shouldn't instead (a) skip adding paths
> that use methods which are disabled and then (b) if we don't end up
> with any paths for that RelOptInfo, try again, ignoring disabling
> GUCs.
Hm. That seems complicated. Is it clear that we'd always notice early
enough that we have no plan to know which paths to reconsider? I think
there are cases where we'd only notice that a few levels up.

As a first step I'd be inclined to "just" raise disable_cost to
something like 1.0e12. Much higher than that and we get into territory
where the loss of precision starts to be significant enough that I'm
not sure we're always careful enough to perform the math in the right
order (e.g. 1.0e16 + 1 evaluates to 1.0e16, and 1.0e20 + 1000 evaluates
to 1.0e20). I've seen queries with costs above 1e10 where that costing
wasn't insane.

Then, in a larger patch, we could go for something like Heikki's
proposal quoted by Zhenghua Lyu upthread, where we treat 'forbidden' as
a separate factor in comparisons of path costs, rather than fudging the
cost upwards. But some care will be needed to make sure we don't
regress performance too much due to the additional logic in
compare_path_costs et al.

I'd also be curious whether there's some other problem with the cost
calculation here - some of the quoted final costs seem high enough to
be suspicious. I'd be curious to see a plan...

Greetings,

Andres Freund