bearophile wrote:
Three good blog posts about undefined behaviour in C and C++: http://blog.regehr.org/archives/213 http://blog.regehr.org/archives/226 http://blog.regehr.org/archives/232

In those posts (and elsewhere) the author, who is an expert on the subject, takes
several well-aimed bites at compiler writers.

Among other things, in those three posts he discusses two programs similar to these:

import std.c.stdio: printf;
void main() {
    // -int.min can't be represented in an int; in C the equivalent negation is undefined
    printf("%d\n", -int.min);
}

import std.stdio: writeln;
void main() {
    enum int N = (1L).sizeof * 8;    // 64: number of bits in a long
    auto max = (1L << (N - 1)) - 1;  // shifts a 1 into the sign bit, then wraps
    writeln(max);
}

I believe that D can't be considered a step forward in systems programming
languages until it gives much more serious consideration to integer overflows
(and integer-related undefined behaviour).

The good thing is that Java is a living example that even after removing most
integer-related undefined behaviour, code can still run as fast as C, and
sometimes faster (on typical desktop machines).

You're conflating two different things here - undefined behavior and behavior on overflow. The Java spec says that integer overflow is ignored, for example.
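
For example, here is a minimal sketch (mine, not from the Java spec or the posts above) of what "ignored" overflow means in practice, written in D since, as noted below, D makes the same guarantee:

import std.stdio: writeln;

void main() {
    int x = int.max;        // 2147483647
    x = x + 1;              // overflow is not an error; the value wraps around
    writeln(x == int.min);  // prints: true
}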

In C++, overflow behavior is undefined because C++ still supports ones-complement arithmetic. Java and D specify integer arithmetic to be 2's complement. Java defined left shift to, not surprisingly, match what the x86 CPU does. This, of course, will conveniently not result in any penalty on the x86 for conforming to the spec.
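
To make the shift case concrete, here is another small sketch (again mine, assuming the two's complement behavior described above) of what the second program relies on:

import std.stdio: writeln;

void main() {
    long x = 1L << 63;           // shifts a 1 into the sign bit; well-defined in D
    writeln(x == long.min);      // prints: true
    writeln(x - 1 == long.max);  // wrap-around gives long.max, 9223372036854775807
}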
