Hello

I have looked at your module now, and I can't say which one is better - if I understand correctly, the two modules have slightly different goals, and I don't know whether they can be mixed. There are two strategies: a rich contrib module, or a few functions in core. We probably can't have an overly rich API for the record type the way regexp has - transform the result to a table, to an array, read values into an array or a table, read the tuple descriptor into an array or a table, and so on. There should be one strategy, although both approaches (array based and table based) are legitimate. I prefer a table based solution, because then I don't need an extra "unnest", but other preferences are valid too.

I am dissatisfied with your design of passing the explicit target type via an unused value. I think we don't have the infrastructure for that yet - from my point of view it is better to use a common type for now, and that type is text. This is nothing new - plpgsql uses it too.
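To show what I mean by "explicit target type via an unused value", here is a minimal sketch of that pattern (cast_from_text is only an example name I made up, not a function from your module; the extra parameter is never read for its value, only for its type):

    CREATE OR REPLACE FUNCTION cast_from_text(v text, target anyelement)
    RETURNS anyelement AS $$
    DECLARE
      result ALIAS FOR $0;    -- variable of the actual result type
    BEGIN
      -- "target" only carries the wanted result type, its value is ignored
      EXECUTE 'SELECT ' || quote_literal(v) || '::' || pg_typeof(target)::text
        INTO result;
      RETURN result;
    END;
    $$ LANGUAGE plpgsql;

    SELECT cast_from_text('2010-12-11', NULL::date);

The caller always has to write NULL::type just to select the result type, and that is the part I don't like.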
I do see one good design for an explicit target type, based on polymorphic types and respecting PostgreSQL fmgr practice: we would have to allow polymorphic functions without polymorphic parameters. Such a function should be designed to return its value in "unknown" format when it has no outer type information. That information can be passed in the function context. When the function context is not null, the function has to read the target type from it and return the value in that type.

Who fills the function context? That is a task for the executor. When a CAST contains just a function call, we can check whether the function is polymorphic, and if it is, we can set the function context to the target type. Then we don't need to call a conversion function at all, because the polymorphic function must already return data in the correct format. So a polymorphic function used without polymorphic parameters and without an explicit CAST returns unknown.

Without the described functionality we can design a non-polymorphic function that returns the unknown type. When similar functionality is implemented later, this function can be changed to polymorphic, and from the user's perspective nothing changes.

Regards

Pavel Stehule

2010/12/11 Florian Pflug <f...@phlo.org>:
> On Dec 11, 2010, at 17:55, Pavel Stehule wrote:
>> It is the same trick that I use in record_set_fields.
>
> Oh, I see. Sorry, I must have missed that when I read your blog entry.
>
>> But I don't want to use it for reading a value. I don't like it. You
>> don't need to know a value, you have to know a type - NULL::type. It is
>> just not nice :).
>
> Well, no, it's not the most elegant API on earth, that's for sure. But in
> my opinion it isn't so bad that it justifies casting everything to text.
>
>> I thought about it too, and maybe it is time for a new polymorphic type
>> "anytype" - then you don't need to write the slightly strange NULL::type.
>>
>> It could be "fieldvalue(myrec, type1, false)"
>
> Hm, I don't think the ability to write just "type1" instead of
> "NULL::type1" is worth the necessary effort.
>
> If anything, I'd allow functions to use anyelement as their return value
> *without* having any polymorphic arguments. You'd need to surround calls
> to such a function with a CAST() expression, unless something else
> determines the type. In pl/pgsql, for example, one could allow the CAST()
> to be skipped for assignments to variables. Then you could write
>
>   v_value := fieldvalue(myrec, 'f1', true)
>
> That'd at least free you from having to specify the type in some cases.
> But still, even this seems to be a lot of effort for quite little gain...
>
> best regards,
> Florian Pflug
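P.S. Just to make the intended behaviour of the proposal above concrete - none of this exists yet, "fieldvalue" is only the example name used in this thread, and the exact signature does not matter here:

    -- no outer type information: the function returns its value as unknown
    SELECT fieldvalue(myrec, 'f1');

    -- the executor sees that the CAST wraps a call of a polymorphic function,
    -- so it puts the target type (integer) into the function context; the
    -- function then returns an integer datum and no conversion step is needed
    SELECT CAST(fieldvalue(myrec, 'f1') AS integer);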