2010/12/12 Florian Pflug <f...@phlo.org>:
> On Dec 12, 2010, at 00:19, Pavel Stehule wrote:
>> I prefer a table-based solution, because I don't need an "unnest"
>> call, but other preferences are valid too.
> That's fine with me.
>
>> I'm dissatisfied with your design of passing the explicit target type
>> via an unused value. I think we don't have the infrastructure for it
>> yet - in my view it's better to use a common type, which is text for
>> now. It's nothing new - plpgsql uses it too.
> Sorry, I can't follow you here. Where does plpgsql use text as a "common" type?
>
>> I see one good design for an explicit target type, based on polymorphic
>> types, that respects PostgreSQL fmgr practice:
>>
>> We have to allow polymorphic functions without polymorphic
>> parameters. These functions should be designed to return a value in
>> "unknown" type format when the function has no outer information.
> I don't think "unknown" is the right type for that. As far as I know,
> "unknown" is still a textual type, used to have some type to assign to string
> literals during parsing when no better type can be inferred.
>
>> This information can be passed in the function context. When the
>> function context isn't null, the function has to read the target type
>> and return a value of that type. Who can fill the function context? It
>> is a task for the executor. And when a CAST contains just a function
>> call, we can check whether the function is polymorphic; if it is, we
>> can set the function context to the target type, and then we don't
>> need to call a conversion function, because the polymorphic function
>> must return data in the correct format.
> The main difficulty is that currently types are assigned in a bottom-up
> fashion, as far as I know. To make functions with a polymorphic return value
> but without polymorphic arguments work, you need to assign the return type in
> a top-down fashion (it depends on where the value *goes*, not where it *comes
> from*). That seems like a rather huge change and has the potential to
> complicate quite a few other parts, most notably function lookup/resolution.
>
> Plus, the general case where type information must bubble up more than one 
> level seems pretty much intractable, as it'd require a full-blown type 
> inference algorithm like ML or Haskell. Not a place where we want to go, I 
> believe.
>
> The restricted case, on the other hand, brings very little benefit compared
> to the dummy-parameter approach. Yeah, "<polymorphic function>()::type" may
> look a bit cleaner than "<polymorphic function>(NULL::type)", but that's
> about it. It's only assignments in pl/pgsql that really benefit, since you'd
> be able to leave out the type completely, writing simply "v_value :=
> <polymorphic_function>()". Does that really warrant the effort that'd be
> involved?
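
For illustration, the dummy-parameter approach mentioned above can already be
written today. A minimal sketch - the function name cast_to and its body are
only an example, not anything that already exists:

CREATE OR REPLACE FUNCTION cast_to(txt text, target anyelement)
RETURNS anyelement AS $$
BEGIN
  -- "target" is never read for its value; it only carries the desired
  -- result type, which we obtain via pg_typeof()
  EXECUTE format('SELECT %L::%s', txt, pg_typeof(target)) INTO target;
  RETURN target;
END;
$$ LANGUAGE plpgsql;

-- the NULL::int argument exists only to determine the result type
SELECT cast_to('42', NULL::int);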

There is a second possibility - and probably a simpler one. We can use a
specialised statement with its own parser/executor node; then the
implementation should be really simple.

syntax:

EXTRACT_VALUE(expr1 FROM expr2 AS typename) ... RETURNS typename

expr1 ... result must be convertible to text (the field name)
expr2 ... result must be a composite type

disadvantage - EXTRACT_VALUE must be a keyword
advantage - simple implementation, available in all environments, readable

var := EXTRACT_VALUE('f1' FROM myrec AS int);

note: the name of this statement isn't important now; it can be EXTRACT_FIELD, ...
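
To make the intended usage more concrete, a sketch of how it might look
inside a plpgsql trigger function (EXTRACT_VALUE is only the proposed
syntax here, nothing that exists today; audit_id is just an example name):

CREATE FUNCTION audit_id() RETURNS trigger AS $$
DECLARE
  v_id int;
BEGIN
  -- the target type is written explicitly, so no dummy parameter and no
  -- extra cast are needed
  v_id := EXTRACT_VALUE('id' FROM NEW AS int);
  RAISE NOTICE 'row id is %', v_id;
  RETURN NEW;
END;
$$ LANGUAGE plpgsql;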

comments, ideas?

Regards

Pavel Stehule
