Tom Lane wrote:
Would a simple constant value be workable, or do we need some more
complex model (and if so what)?
Consider:
ANALYZE myfunc(integer) ON (SELECT myfunc(7)) WITH RATIO 0.03;
ANALYZE myfunc(text,text) ON (SELECT myfunc(mt.a,mt.b) FROM mytable mt) WITH RATIO 1.071;
ANALYZE
Mark Dilger [EMAIL PROTECTED] writes:
Tom Lane wrote:
Would a simple constant value be workable, or do we need some more
complex model (and if so what)?
Consider:
ANALYZE myfunc(integer) ON (SELECT myfunc(7)) WITH RATIO 0.03;
...
It seems to me that the above system would work perfectly
Tom Lane wrote:
Mark Dilger [EMAIL PROTECTED] writes:
Tom Lane wrote:
Would a simple constant value be workable, or do we need some more
complex model (and if so what)?
Consider:
ANALYZE myfunc(integer) ON (SELECT myfunc(7)) WITH RATIO 0.03;
...
It seems to me that the above system would
I complained about how:
The query is
SELECT p1.opcname, p1.opcfamily
FROM pg_opclass AS p1
WHERE NOT EXISTS(SELECT 1 FROM pg_amop AS p2
WHERE p2.amopfamily = p1.opcfamily
AND binary_coercible(p1.opcintype, p2.amoplefttype));
and investigation showed
Neil Conway [EMAIL PROTECTED] writes:
On Mon, 2007-01-15 at 15:05 -0500, Tom Lane wrote:
maybe we should just do the constant for starters and see how many
people really want to write C-code estimators ...
+1
It seemed like that was the list's consensus, so I'll go off and do the
Tom Lane wrote:
BTW, I'm thinking that a cost constant probably ought to be measured
in units of cpu_operator_cost. The default for built-in functions would
thus be 1, at least till such time as someone wants to refine the
estimates. We'd probably want the default for PL and SQL functions
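As a concrete illustration of the constant-cost idea (hypothetical syntax only; the thread had not settled on a grammar), attaching a per-function cost measured in units of cpu_operator_cost might look like:

```sql
-- Hypothetical syntax, not an agreed-upon feature: attach a cost
-- constant, in units of cpu_operator_cost, to an existing function.
-- Built-ins would default to 1; PL and SQL functions would get some
-- larger default.
ALTER FUNCTION myfunc(integer) COST 100;

-- A cheap SQL wrapper, close to a single operator evaluation:
ALTER FUNCTION cheap_wrapper(text) COST 2;
```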
Ron Mayer [EMAIL PROTECTED] writes:
Tom Lane wrote:
BTW, I'm thinking that a cost constant probably ought to be measured
in units of cpu_operator_cost.
Any chance that costs could eventually change to real-world units?
Define real-world units.
If you like you can try to adjust things so
On Mon, 2007-01-15 at 13:54 -0500, Neil Conway wrote:
On Mon, 2007-01-15 at 10:51 -0800, Richard Troy wrote:
I therefore propose that the engine evaluate -
benchmark, if you will - all functions as they are ingested, or
vacuum-like at some later date (when valid data for testing may exist),
Jeff Davis [EMAIL PROTECTED] writes:
Would any form of cost estimate have meaning if the function has side
effects? If it's a volatile function, doesn't that mean that the planner
can't avoid or favor executing it?
No, not really. If the function is down inside a sub-select or
something like
On Mon, 2007-01-15 at 15:05 -0500, Tom Lane wrote:
Brian Hurt [EMAIL PROTECTED] writes:
Non-developer here, but we use a lot of plpgsql functions at work. And
the functions we use fall into two broad, ill-defined categories:
expensive functions and cheap functions. What I'd like as a
Tom Lane wrote:
Instead, I'm thinking it might be time to re-introduce some notion of
function execution cost into the system, and make use of that info to
sort WHERE clauses into a reasonable execution order.
I imagine you've thought of this already but just in case, the cost of the
Gregory Stark [EMAIL PROTECTED] writes:
I imagine you've thought of this already but just in case, the cost of the
function call has to be combined with the selectivity to get this right. If
you can do an expensive but very selective clause first and save 100 cheap
calls that almost always
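The interaction Stark describes has a standard closed form. A minimal sketch (illustrative, not PostgreSQL source; names are invented): under short-circuit AND evaluation, expected per-row cost is minimized by sorting clauses on cost / (1 - selectivity) ascending, which is exactly how an expensive but very selective clause can earn an early slot.

```python
# Sketch: ordering AND-ed filter clauses by combining per-call cost
# with selectivity.  "selectivity" here is the fraction of rows that
# *pass* the clause; evaluation stops at the first failing clause.

def order_clauses(clauses):
    """clauses: list of (name, per_call_cost, selectivity).
    Sorts by the rank cost / (1 - selectivity); a clause that
    passes everything (selectivity 1.0) is evaluated last."""
    def rank(clause):
        _, cost, sel = clause
        return cost / (1.0 - sel) if sel < 1.0 else float("inf")
    return sorted(clauses, key=rank)

def expected_cost(ordered):
    """Expected per-row cost of an ordering: each clause is paid
    only with the probability that all earlier clauses passed."""
    total, pass_prob = 0.0, 1.0
    for _, cost, sel in ordered:
        total += pass_prob * cost
        pass_prob *= sel
    return total
```

For example, a clause costing 100 that passes 1% of rows (rank 101) correctly sorts ahead of a clause costing 5 that passes 99% (rank 500), even though it is twenty times more expensive per call.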
Tom Lane [EMAIL PROTECTED] writes:
Gregory Stark [EMAIL PROTECTED] writes:
I imagine you've thought of this already but just in case, the cost of the
function call has to be combined with the selectivity to get this right. If
you can do an expensive but very selective clause first and save
So I've been working on the scheme I suggested a few days ago of
representing equivalence classes of variables explicitly, and avoiding
the current ad-hocery of generating and then removing redundant clauses
in favor of generating only the ones we want in the first place. Any
clause that looks
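The explicit-equivalence-class scheme can be sketched with a union-find structure (a toy model, not the actual planner code): each mergejoinable equality a = b merges two classes, and afterwards the planner can generate only the clauses it wants from each class instead of generating all pairwise clauses and pruning the redundant ones.

```python
# Toy sketch of explicit equivalence classes of variables,
# using union-find with path halving.  Not PostgreSQL source.

class EquivalenceClasses:
    def __init__(self):
        self._parent = {}

    def _find(self, var):
        self._parent.setdefault(var, var)
        while self._parent[var] != var:
            # Path halving keeps lookups near-constant time.
            self._parent[var] = self._parent[self._parent[var]]
            var = self._parent[var]
        return var

    def add_equality(self, a, b):
        """Record an equality clause a = b by merging the classes."""
        ra, rb = self._find(a), self._find(b)
        if ra != rb:
            self._parent[ra] = rb

    def members(self, var):
        """All variables known equal to var (its equivalence class)."""
        root = self._find(var)
        return sorted(v for v in self._parent if self._find(v) == root)
```

Given a.x = b.y and b.y = c.z, the class {a.x, b.y, c.z} is represented once, and any single pair can be emitted as a join clause without ever materializing the redundant third clause.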
Tom Lane wrote:
Instead, I'm thinking it might be time to re-introduce some notion of
function execution cost into the system, and make use of that info to
sort WHERE clauses into a reasonable execution order.
That sounds like a straightforward idea.
This example
would be fixed with even a
Heikki Linnakangas [EMAIL PROTECTED] writes:
Tom Lane wrote:
Would a simple constant value be workable, or do we need some more
complex model (and if so what)?
A simple constant would probably be enough. If we want anything fancier
than that, it should be up to the author of the function to
Tom Lane wrote:
Heikki Linnakangas [EMAIL PROTECTED] writes:
A simple constant would probably be enough. If we want anything fancier
than that, it should be up to the author of the function to define the
cost model as well. I'm envisioning that you could attach a custom cost
function to a
Heikki Linnakangas [EMAIL PROTECTED] writes:
Tom Lane wrote:
If we go this route it seems like we'll need four more columns in
pg_proc (cost estimation function OID, rowcount estimation function OID,
fallback cost constant, fallback rowcount constant).
What would the fallbacks be for?
By
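One plausible reading of the four-column proposal, sketched in Python (the dictionary keys are illustrative stand-ins, not actual pg_proc column names): the planner consults the custom estimator when one is attached and falls back to the stored constant otherwise.

```python
# Sketch of the lookup rule implied by the four-column proposal:
# use the function's custom cost estimator if it has one, else the
# fallback constant (in units of cpu_operator_cost).  Keys here are
# hypothetical, not real pg_proc columns.

def function_cost(proc, args):
    """proc: dict with an optional 'cost_estimator' callable (taking
    the argument list) and a 'fallback_cost' constant."""
    estimator = proc.get("cost_estimator")
    if estimator is not None:
        return estimator(args)
    return proc["fallback_cost"]
```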
On Mon, 15 Jan 2007, Tom Lane wrote:
So I've been working on the scheme I suggested a few days ago of
representing equivalence classes of variables explicitly, and avoiding
the current ad-hocery of generating and then removing redundant clauses
in favor of generating only the ones we want in
On Mon, 2007-01-15 at 10:51 -0800, Richard Troy wrote:
I therefore propose that the engine evaluate -
benchmark, if you will - all functions as they are ingested, or
vacuum-like at some later date (when valid data for testing may exist),
and assign a cost relative to what it already knows -
On Mon, 15 Jan 2007, Neil Conway wrote:
On Mon, 2007-01-15 at 10:51 -0800, Richard Troy wrote:
I therefore propose that the engine evaluate -
benchmark, if you will - all functions as they are ingested, or
vacuum-like at some later date (when valid data for testing may exist),
and assign
Neil Conway wrote:
On Mon, 2007-01-15 at 10:51 -0800, Richard Troy wrote:
I therefore propose that the engine evaluate -
benchmark, if you will - all functions as they are ingested, or
vacuum-like at some later date (when valid data for testing may exist),
and assign a cost relative to what
Brian Hurt [EMAIL PROTECTED] writes:
Non-developer here, but we use a lot of plpgsql functions at work. And
the functions we use fall into two broad, ill-defined categories:
expensive functions and cheap functions. What I'd like as a user is
some way to tell the planner this function is
On Mon, 2007-01-15 at 15:05 -0500, Tom Lane wrote:
maybe we should just do the constant for starters and see how many
people really want to write C-code estimators ...
+1
BTW, your proposal would still push down all qualifiers, right?
Hellerstein's xfunc work discusses situations in which it
Neil Conway [EMAIL PROTECTED] writes:
BTW, your proposal would still push down all qualifiers, right?
Yeah, I have no intention of readopting xfunc in the near future ...
especially seeing that it's possible for the user to force that
sort of thing if he really has to.
SELECT * FROM