On 01.11.2011 06:33, Jeff Davis wrote:
On Mon, 2011-10-24 at 15:05 +0400, Alexander Korotkov wrote:

I think implementing subtype_diff for each datatype is OK. We could
implement a universal function based on the minus operator and a cast
to double precision, but such a solution might be unacceptable in
terms of both predictability (the operator and the casting function
might not do what we expect) and performance.

Done.
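
For illustration, a per-datatype subtype_diff under the patch's
CREATE TYPE ... AS RANGE syntax might look like the sketch below
(timerange and time_subtype_diff are made-up names; the difference is
returned in seconds as a float8):

CREATE FUNCTION time_subtype_diff(time, time) RETURNS float8
    AS 'SELECT EXTRACT(EPOCH FROM ($1 - $2))'
    LANGUAGE sql STRICT IMMUTABLE;

CREATE TYPE timerange AS RANGE (
    subtype = time,
    subtype_diff = time_subtype_diff
);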

Everything is complete in this patch except for two optional things,
which I still intend to do but which might best be done in a separate
commit:

   * support typmod for ranges
   * support casts between different range types

Both of these things, I believe, require the introduction of a
RangeCoerceExpr, similar to ArrayCoerceExpr. That's fine, but it creates
a rather large diff, so it might be best left for a later commit.
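
As an illustrative sketch of what those two features would enable
(neither works with this patch, and the typmod syntax in particular is
speculative):

-- typmod on a range column, passed through to the subtype:
CREATE TABLE meetings (during tsrange(0));

-- a cast between different range types, needing a RangeCoerceExpr:
SELECT int4range(1, 10)::int8range;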

Using the test table from the rangetypes test case:

postgres=#  select * from test_range_gist where 10 <@ ir;
ERROR:  unsupported type: 3904

This seems to be coming from the selectivity estimation function; 3904 is the OID of the int4range type. The restriction estimator for <@ is scalargtsel, which is normally used for scalar > and >=. That doesn't seem right. But what do we store in the statistics for range types in the first place, and what would be the right thing to do for selectivity estimation?
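
For context, this is roughly how the element-containment operator ends
up paired with scalargtsel (a sketch of the catalog entry written as
CREATE OPERATOR, not the literal source):

CREATE OPERATOR <@ (
    LEFTARG   = anyelement,
    RIGHTARG  = anyrange,
    PROCEDURE = elem_contained_by_range,
    RESTRICT  = scalargtsel  -- estimator meant for scalar > / >=
);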

I'll dig deeper into this tomorrow...

--
  Heikki Linnakangas
  EnterpriseDB   http://www.enterprisedb.com
