Well, there might not be any technical limitation against having nulls
in the list, but as a design decision in the API, I can think of
reasons why you wouldn't want them. The most obvious is that by
making the guarantee that it contains no null pointers, you don't have
to check for nulls (for example, when iterating and using each value).
In Java, a null that makes its way into your array when you aren't
checking for it will obviously surface as an exception at runtime. Of
course, you could make the argument that you do sometimes want it, and
the presence of NSKeyValueCoding.NullValue presents a similar set of
problems.
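To put the Java side concretely, here's a minimal sketch of how a missed null check in a collection surfaces at runtime (the class and method names are mine, purely for illustration):

```java
import java.util.Arrays;
import java.util.List;

public class NullCheckDemo {
    // Sums the lengths of all strings in the list. Throws
    // NullPointerException if a null sneaks in -- exactly the check
    // that a no-nulls guarantee on the collection lets you skip.
    static int totalLength(List<String> names) {
        int total = 0;
        for (String n : names) {
            total += n.length(); // NPE here if n is null
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(totalLength(Arrays.asList("a", "bc")));
        try {
            totalLength(Arrays.asList("a", (String) null));
        } catch (NullPointerException e) {
            System.out.println("a null slipped into the list");
        }
    }
}
```

NSArray's refusal to hold nil effectively moves this failure from some arbitrary later read to the moment of insertion, which is usually much easier to debug.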
I would say that in Objective-C, the decision makes more sense because
nil is such an interesting animal. Since messages sent to nil aren't
an error, you might send a message to a nil element of your array and
get zero back instead of the return value you expected. The fact that
this "fails" silently, depending on what your desired behavior is, can
certainly be cause for a headache. This is a huge difference from what
happens when you send a message to NSNull. This code is valid, and
won't cause any problem at runtime:
NSLog(@"Sending a message to nil");
[nil doSomething];
NSLog(@"I sent a message to nil");
In fact, it won't do anything - the app will keep on truckin', and
you'll get the output "I sent a message to nil" (and whatever else
might follow). However, this code will raise an
NSInvalidArgumentException before reaching the second NSLog call:
NSLog(@"Sending a message to NSNull");
[[NSNull null] doSomething];
NSLog(@"I sent a message to NSNull");
So in Objective-C, I'd say that's an important distinction. I'm not
trying that hard to defend this as anything but a design decision -
but there are some points in favor of this behavior, at least as far
as Obj-C is concerned. There's also a lengthy discussion of some of
the other fun of 'nil' here: http://cocoawithlove.com/2008/06/doing-things-in-cocoa-with.html
Clark
On Dec 13, 2008, at 7:33 PM, Mike Schrag wrote:
For example, in WO, if you're using an NSDictionary "bindings" for
query bindings, you might want to query for some attribute
"myAttribute" that equals NULL in your database, so you would do
something like bindings.setObjectForKey(NSKeyValueCoding.NullValue,
"myAttribute"). So when EOF is constructing the qualifier, it knows
that it should produce "WHERE my_attribute = NULL", but that when
bindings.objectForKey("myOtherAttribute") is null (as in a null
pointer), it knows that's not an attribute you want to qualify on.
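The three-way distinction described here (sentinel means "qualify on NULL", a real value means "qualify on that value", an absent key means "don't qualify at all") can be sketched in plain Java, with a HashMap standing in for NSDictionary. The NULL_VALUE constant and clauseFor method are hypothetical stand-ins for illustration, not the actual EOF API:

```java
import java.util.HashMap;
import java.util.Map;

public class BindingsSketch {
    // Stand-in for NSKeyValueCoding.NullValue: a unique sentinel object
    // that can live in a dictionary where a real null could not.
    static final Object NULL_VALUE = new Object();

    // Hypothetical qualifier builder: decides what SQL fragment (if any)
    // to produce for one attribute, based on its binding.
    static String clauseFor(Map<String, Object> bindings, String attr) {
        Object v = bindings.get(attr);
        if (v == null) {
            return null;              // key absent: don't qualify on attr
        } else if (v == NULL_VALUE) {
            return attr + " = NULL";  // sentinel: match NULL in the database
        } else {
            return attr + " = ?";     // ordinary bound value
        }
    }
}
```

A HashMap happens to allow real null values, so Java could distinguish the cases with containsKey instead; the sentinel is what makes the pattern work in a dictionary that, like NSDictionary, refuses nulls outright.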
This is definitely a valid scenario, but it seems like more of an
argument for the existence of NSNull (which I'm not really against,
only as a hack around the lack of null-in-collection support) and
less of one for why you can't add a real null as a value. It doesn't
seem like those two capabilities would be mutually exclusive.
ms
_______________________________________________
Do not post admin requests to the list. They will be ignored.
Webobjects-dev mailing list (Webobjects-dev@lists.apple.com)
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/webobjects-dev/cpmueller%40mac.com
This email sent to cpmuel...@mac.com