On Monday, 24 November 2014 at 15:56:44 UTC, Andrei Alexandrescu wrote:
> On 11/24/14 4:54 AM, Don wrote:
>> In D, 1u - 2u > 0u. This is defined behaviour, not an overflow.
>
> I think I get what you mean, but overflow is also defined behavior (in D at least). -- Andrei
Aargh! You're right. That's new, and dreadful. It didn't use to be.
The offending commit is:

alexrp 2012-05-15 15:37:24

which only provides an unsigned example.
Why are we defining behaviour that is always a bug? Java makes it defined, but it has to, because it doesn't have unsigned types.
I think the intention probably was to improve on the C situation, where signed overflow is undefined behaviour that really should be defined. But do we really want to preclude ever having overflow checking for integers?