Now, let's look back at the alternative, where we keep the flexibility of the
null label, but treat patterns as meaning what they mean, and let the switch
decide whether to throw based on whether there is a nullable pattern or not.
So a switch with a total type pattern -- that is, `var x` or `Object x` -- will
accept null and thread it into the total case (which must also be the last
case).
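
Concretely, a sketch of those semantics (class and method names here are just
for illustration):

```java
class NullInSwitchSketch {
    static String describe(Object obj) {
        return switch (obj) {
            case Integer i -> "an Integer: " + i;
            case String s  -> "a String: " + s;
            case Object o  -> "something else: " + o;  // total case, must be last; null lands here
        };
    }
}
```

Under these semantics, `describe(null)` evaluates the last arm rather than
throwing NullPointerException.
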
To me this is the material point, and has been all along:
There is never a need to perform an O(N) visual scan of
a switch to see if it accepts nulls, since the users simply
have to inspect the final (and perhaps initial) case of
the switch.  Good style will avoid puzzlers such as final
cases which are difficult to classify (total vs. partial).
The language does not have to carry the burden of
enforcing full annotation.

I think the concern on the part of the null-fearers is not so much the O(n) scan (though that's a concern) as the subtle difference between `case Object o` and `default`.  They are almost identical, except in how they treat null.  No one likes having to carry that distinction around in their head.
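
A side-by-side sketch of that difference (again, illustrative names only; the
two switches differ only in their final clause):

```java
class ObjectVsDefaultSketch {
    static String lastCaseTotal(Object obj) {
        return switch (obj) {
            case String s -> "a String";
            case Object o -> "something else";   // total pattern: null matches here
        };
    }

    static String lastCaseDefault(Object obj) {
        return switch (obj) {
            case String s -> "a String";
            default       -> "something else";   // default is not a pattern: null throws NPE
        };
    }
}
```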

The essential point, though, is that the subtle distinction has to live somewhere; we don't get to banish it, we just get to try to put it where it does the least damage.  And the place that seems least damaging in the short run is likely more damaging in the long run.

(I am starting to think I should rethink Serialization's position as my "most regretted feature", given the degree to which null distorts every language evolution discussion....)
