On 2/1/14, 2:14 AM, Jonathan M Davis wrote:
On Saturday, February 01, 2014 04:01:50 deadalnix wrote:
Dereferencing it is unsafe unless you put runtime check.

How is it unsafe? It will segfault and kill your program, not corrupt memory.
It can't even read any memory. It's a bug to dereference a null pointer or
reference, but it's not unsafe, because it can't access _any_ memory, let
alone memory that it's not supposed to be accessing, which is precisely what
@safe is all about.

This has been discussed to death a number of times. A field access obj.field will use addressing with a constant offset. If that offset is larger than the lowest address allowed to the application, unsafety may occur.

The amount of low-address memory protected is OS-dependent, but 4KB can virtually always be counted on. For fields placed beyond that limit, a runtime test must be inserted. There are few enough objects larger than 4KB out there to make this practically a non-issue. But the checks must be there.

  Which is stupid for something that can be verified at compile time.

In the general case, you can only catch it at compile time if you disallow it
completely, which is unnecessarily restrictive. Sure, some basic cases can be
caught, but unless the code where the pointer/reference is defined is right
next to the code where it's dereferenced, there's no way for the compiler to
have any clue whether it's null or not. And yes, there's certainly code where
it would make sense to use non-nullable references or pointers, because
there's no need for them to be nullable, and having them be non-nullable
avoids any risk of forgetting to initialize them. But that doesn't mean that
nullable pointers and references aren't useful, or that you can catch all
instances of a null pointer or reference being dereferenced at compile time.

The Java community has had good experience with @Nullable: http://stackoverflow.com/questions/14076296/nullable-annotation-usage


Andrei
