On Wed, 2003-01-29 at 14:53, Austin Hastings wrote:

> Leaving out the whole "is default()" bit, what happens when I:
> 
> my int @a;
> @a[4] = 100;
> @a[2];
> 
> What does @a[2] return? It must return something, and that something
> can't be undef, because ... <above> .... So, what is it? Whatever it
> is, that's the default.

That's an interesting question. At first, I was ready to say "zero, of
course", but the more I thought about it, the more I realized that an
array whose storage is not ultimately a collection of scalars can do one
of three things:

        * Initialize new array sections to <default>
        * Only default such arrays when elements are "off-the-end"
        * Not allow default on such arrays

The second one seems like a cruel joke to play on a programmer. The
first is only slightly better. I'm beginning to lean toward the third...
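
To make the first option concrete (this is just a sketch of the
proposed semantics; nothing here is settled):

    my int @a;      # storage is raw ints, not scalars
    @a[4] = 100;    # extending the array initializes @a[0..3] to the default
    say @a[2];      # prints 0: a freshly-initialized int, never undef

At least that's predictable, even if it silently invents values the
programmer never wrote.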

However, there is always the idea of using the SQL-like null/not null
concept to allow simple types to be undefined. It makes sense to me for
the programmer to be able to constrain data to an integer type but
still allow undefined values. Even if there's no saving in terms of
storage, I think it should be allowed.
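
For instance, the boxed/unboxed type distinction could carry exactly
that meaning (again, a sketch: assume Int elements may be undef while
int elements may not):

    my Int @a;              # "null allowed": elements are full scalars
    @a[4] = 100;
    say defined @a[2];      # false: an undefined integer slot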

Perhaps I'm overreacting to the first option. It's not so bad. undef
should probably still keep its old semantics when being converted to an
integer, though, and go to zero.
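
So an undefined element would just numify to zero, perhaps with the
usual warning (another sketch, same assumptions as above):

    my Int @a;
    @a[4] = 100;
    say @a[2] + 0;      # prints 0: undef in numeric context goes to zero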


-- 
Aaron Sherman <[EMAIL PROTECTED]>
This message (c) 2003 by Aaron Sherman,
and granted to the Public Domain in 2023.
Fight the DMCA and copyright extension!

