On Saturday, 16 November 2013 at 23:34:55 UTC, Jonathan M Davis wrote:
If you want to use the type system to try to protect against dereferencing null, then having a wrapper which guarantees that the object _isn't_ null makes a lot more sense, particularly since using Optional<T> instead of T makes no guarantee whatsoever that all of the other T's in the program are non-null. At best, if Optional<T> is used 100% consistently, you know that when a naked T is null, it's a bug.

You're right, it's better to ensure that the object is not null in the first place, which is what languages like Haskell, Spec#, Kotlin, and Rust do. D currently doesn't do this, and most developers probably won't have the discipline to use NonNull consistently throughout their code. The best we can do on that front is make sure it's used consistently within Phobos, so we can guarantee that we'll never give a user a null value.
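
To illustrate, here's roughly what such a wrapper could look like. This is only a sketch; the NonNull name and the details are hypothetical, not necessarily what NonNullable will end up looking like in std.typecons:

import std.exception : enforce;

// Hypothetical wrapper: once constructed, the reference it holds is never null.
struct NonNull(T) if (is(T == class))
{
    private T payload;

    this(T value)
    {
        enforce(value !is null, "NonNull cannot wrap a null reference");
        payload = value;
    }

    @property inout(T) get() inout { return payload; }
    alias get this;

    // Disallow default construction, which would leave payload null.
    @disable this();
}

class SalesGood { int calculatePrice() { return 42; } }

void main()
{
    auto good = NonNull!SalesGood(new SalesGood);
    assert(good.calculatePrice() == 42);    // good can never be null here
    // NonNull!SalesGood oops;              // error: default construction is disabled
}

The discipline problem is exactly that nothing forces a function to take NonNull!SalesGood instead of a bare SalesGood.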

Honestly, I pretty much never have problems with null pointers/references, and my natural reaction when I see people freaking out about them is to think that they don't know what they're doing or that they're just plain paranoid. That doesn't mean that my natural reaction is right.

I think in this case, your natural reaction is wrong, because you've mostly used languages with nullable references. It's a case of the Blub paradox: "Nullable references are good enough. Why bother with all that hairy non-nullable stuff?"

It could easily be the case that many such people are merely programming in environments different enough from anything I've had to deal with that null is actually a real problem for them and that it would be a real problem for me in the same situation. But in my experience, null really isn't a problem, and it can be very useful. So, when people freak out about it and insist on trying to get the type system to protect them from it, it really baffles me. It feels like they're trying to take a very useful tool out of the toolbox just because they weren't careful and managed to scratch themselves with it once or twice.

I don't think anyone's freaking out about null, and you're right that null is useful. The question is, why do we need object references to be nullable by default? If they were non-nullable by default, we could eliminate a whole class of errors for free. Not for some arcane definition of free. This is a free lunch that is being refused. You seem to be asking the question "why do we need them", when you should be asking "what do we lose by not having them".
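
To make that class of errors concrete: in D today, every class reference is nullable by default, so the compiler happily accepts code like this (SalesGood is just a made-up example class):

class SalesGood { int calculatePrice() { return 42; } }

void main()
{
    SalesGood good;                         // class references default to null
    // auto price = good.calculatePrice();  // compiles fine, crashes at runtime
}

With non-nullable references by default, the commented-out line simply wouldn't compile until good had actually been initialized.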

Note that I'm arguing for non-nullable references here, which D is obviously never going to have. The next best thing is, as you suggested, having a wrapper type that we can use to be reasonably sure never holds a null reference. Again, the problem with that is that it requires programmer discipline.

And Java's Optional seems even more useless, because it doesn't actually protect you against dereferencing null, and because it doesn't prevent anything which isn't in an Optional from being null.

See, that's the problem. References are nullable by default in Java, so even with an Optional type and a NonNullable wrapper you can never be 100% sure that you're not dealing with null masquerading as an object. The truly safe thing would be to have the compiler enforce that every reference is wrapped in Optional, or to change the language to disallow null references outright, but neither of those is at all realistic. Still, establishing a convention among Java developers of avoiding objects that aren't wrapped in Optional could get you pretty close.

Much as I don't think that it's worth it, I can at least see arguments for using NonNullable (which will end up in std.typecons eventually) to guarantee that the object isn't null, but I really don't think that using Optional or Nullable on a nullable type gains you anything except the illusion of protection.

Well, again, Optional would force you to check whether the underlying object is null before you use it. You simply can't call, say, calculatePrice() on a Nullable!SalesGood (well, you actually can, because Nullable aliases itself to the wrapped object, which is a huge mistake IMO).
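
To make that concrete (SalesGood is made up, and the alias-this behaviour is as described above):

import std.typecons : Nullable;

class SalesGood { int calculatePrice() { return 42; } }

int priceOrZero(Nullable!SalesGood maybeGood)
{
    // Because Nullable aliases itself to the wrapped value, the unchecked call
    //     return maybeGood.calculatePrice();
    // compiles and only fails at runtime when maybeGood is empty.

    // The check below is a convention the compiler never demands:
    if (maybeGood.isNull)
        return 0;
    return maybeGood.get.calculatePrice();
}

void main()
{
    Nullable!SalesGood empty;
    assert(priceOrZero(empty) == 0);
    assert(priceOrZero(Nullable!SalesGood(new SalesGood)) == 42);
}

A proper Optional type would only hand you the wrapped object through something like the isNull/get dance, with no alias this to bypass it.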

Oh, well. null seems to be a very divisive topic. There are plenty of folks who are convinced that it's a huge disaster, and plenty of others who have no problems with it at all and consider it to be useful. And for some reason, it seems like the folks in Java land freak out over it a lot more than the folks in C++ land. Aside from D, C++ is definitely the language that I've used the most, am most comfortable with, and whose proponents I tend to agree with the most (though obviously, it has plenty of flaws, hence why I prefer D).

I think "huge disaster" might be a mischaracterization on your part. There is no worldwide hysteria over nullable references, just a growing realization that we've been doing it wrong for the past 20 years. And yes, null is useful to indicate the absence of a value, but objects don't have to be nullable by default for you to use null. Many languages make the programmer ask for a nullable reference specifically by appending ? to the type, which makes everyone reading the code aware that the reference you have might be null, and to take appropriate care.
