> Seems like we ought to clean this up.  I'm not sure which way to jump
> though: should we decree that arrays of no elements must always have
> zero dimensions, or should we get rid of that and standardize on, say,
> 1-D array with lower bound 1 and upper bound 0?

Isn't the zero-dimensional array vaguely polymorphic?  If '{}'::int[]
means a one-dimensional array, how do I create an empty
two-dimensional array onto which I can concatenate one-dimensional
arrays that are all of the same length?  I don't necessarily object to
changing this, even if it breaks backward compatibility, but there
should be SOME easy way to do it.
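
For concreteness, here's the sort of behavior I have in mind (results
are from memory, so worth double-checking against a current build):

    -- the zero-dimensional empty array acts as an identity for ||,
    -- taking on whatever dimensionality the other operand has
    SELECT '{}'::int[] || ARRAY[1,2];      -- {1,2}
    SELECT '{}'::int[] || ARRAY[[1,2]];    -- {{1,2}}

    -- two 1-D arrays concatenate into a longer 1-D array, so
    -- accumulating rows needs a 2-D array on the left to start from
    SELECT ARRAY[1,2] || ARRAY[3,4];       -- {1,2,3,4}
    SELECT ARRAY[[1,2]] || ARRAY[3,4];     -- {{1,2},{3,4}}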

> A somewhat related issue that I noticed while poking at this is that
> array_dims() returns NULL for a zero-dimension array.  That seems a bit
> bogus too; wouldn't an empty string be saner?  Of course the issue
> goes away if we get rid of zero-dimension arrays.

Almost all of the existing array functions contain identical checks for
ndims being out of range (and 0 is considered out of range) and return
NULL in that case.  That behavior doesn't appear to make a great deal
of sense to me in general.  If these functions can be called with an
object that's not an array, then the check is not nearly strong enough
to prevent chaos; if they can't, the check is unnecessary.  But maybe
I'm missing something?
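
For example, as things stand today (again from memory; corrections
welcome):

    SELECT array_dims('{}'::int[]);       -- NULL
    SELECT array_ndims('{}'::int[]);      -- NULL
    SELECT array_lower('{}'::int[], 1);   -- NULL
    SELECT array_upper('{}'::int[], 1);   -- NULL
    SELECT array_dims(ARRAY[1,2,3]);      -- [1:3]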

In any event, the correct behavior for all of these functions on a
0-dimensional array should probably be reviewed, unless we eliminate
0-dimensional arrays.

...Robert
