On Sunday, 28 January 2018 at 19:17:49 UTC, Steven Schveighoffer wrote:
On 1/27/18 9:50 AM, ag0aep6g wrote:

Wow, that looks really bad.

Apparently, dmd implements `i < 0` as `i >> 31`. I.e., it shifts the value right until only the sign bit remains. This is ok.

But it implements `i > 0` as `(-i) >> 31`. That would be correct if negation always flipped the sign bit. But it doesn't for `int.min`: `-int.min` is `int.min` again.

So dmd emits wrong code for `i > 0`. O_O

I've filed an issue:
https://issues.dlang.org/show_bug.cgi?id=18315

This is insane. `i > 0` is used in so many places. The only saving grace appears to be that `int.min` is just so uncommonly seen in the wild.

I tested all the way back to 2.040, still has the same behavior.

-Steve


FYI/OT: If you need to check the behavior of old compilers, the "All" option at run.dlang.io might be helpful.

It starts at 2.060 and works best with consistent output (i.e. without stack trace pointers).
Examples:

- https://run.dlang.io/is/IoN3sj (code from the bug report)
- https://run.dlang.io/is/LuxUQ5 (code from the bug report, slightly modified)
- https://run.dlang.io/is/3R4r1U (simple example of "when was the symbol added to Phobos?")

(It's based on Vladimir's regression tester and can be used locally too: https://github.com/dlang-tour/core-dreg)
