Geert Bosch wrote:

> This is completely wrong. Making operations undefined is a two-edged
> sword. On the one hand, you can make more assumptions, but there is
> also the issue that when you want to rewrite expressions, you have
> to be more careful not to introduce undefined behavior where there
> was none before.

No, I think you miss the point.

> The canonical example is addition of signed integers. This operation
> is associative with -fwrapv, but not without.
>
> So
>    a = b + C1 + c + C2;
> could be written as
>    a = b + c + (C1 + C2);
> where the constant addition is performed at compile time.
> With signed addition overflow being undefined, you can't do any
> reassociation, because this might introduce overflows where none
> existed before.

Yes, but the overflows are harmless given that we know the
code we can generate will actually wrap, so it is just fine
to reassociate freely without -fwrapv, since the result will
be the same. Overflow is not an issue. If the final result
has overflowed, then the original program is for sure undefined.
If the final result has not overflowed, then intermediate values
may or may not have overflowed. There are two cases: either the
original canonical order caused overflow, in which case giving
the right result for the overall calculation is fine (though
not required), or it did not, in which case giving the right
result is also fine (and required).
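
To make that concrete, here is a minimal sketch (my own made-up values
and a wrap_add helper, not anything from the original example). It uses
unsigned arithmetic to model 32-bit two's-complement wrapping, since a
test program that actually overflows a signed int would itself be
undefined:

   #include <stdio.h>
   #include <stdint.h>

   /* Model 32-bit two's-complement wrapping using unsigned arithmetic.
      The cast back to int32_t is implementation-defined in ISO C, but
      it is the plain wrapping conversion on the machines discussed here. */
   static int32_t wrap_add(int32_t x, int32_t y)
   {
      return (int32_t)((uint32_t)x + (uint32_t)y);
   }

   int main(void)
   {
      int32_t b = 2147483600, c = -2147483600;
      int32_t C1 = 100, C2 = 200;

      /* Canonical order: b + C1 wraps, but the wrapped intermediate
         cancels against c and the final result is still correct. */
      int32_t canonical = wrap_add(wrap_add(wrap_add(b, C1), c), C2);

      /* Reassociated order: constants folded first, no signed overflow. */
      int32_t reassociated = wrap_add(wrap_add(b, c), C1 + C2);

      printf("%d %d\n", canonical, reassociated);   /* both print 300 */
      return 0;
   }

The canonical order wraps in an intermediate step, the reassociated
order does not, and both arrive at the mathematically correct result,
which is the only case the compiler is obliged to get right.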

> Sometimes I think it would be far better to have -fwrapv as the
> default at -O1 and possibly -Os. Sure, this would disable some
> powerful optimizations, especially those involving loops, but in
> practice it would be very useful to get reasonably good optimization
> while minimizing the number of programs with undefined behavior.
> Also, it would allow some new optimizations, so the total loss of
> performance may be quite acceptable.

This is not right: on a machine where addition does in fact wrap,
-fwrapv can never enable optimizations that would not otherwise be
possible. I must say that when I first read this claim, I had
exactly the same initial reaction as Geert, but then I thought
about it more and realized the claim is indeed correct.

> Also, for safety-critical programs and certification, it is essential
> to be able to reason about program behavior. Limiting the set of
> programs with erroneous or undefined execution is essential.

I don't see this at all. You have to prove that you don't have overflows
in any case if the semantics of your program do not expect overflows.
This is what you have to do in any Ada program. I don't see that
enabling -fwrapv makes such proofs easier or harder.
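
For what it is worth, the proof obligation looks much the same either
way; roughly something like this (a hypothetical guard written out by
hand, just to illustrate the kind of range check that has to be
discharged):

   #include <limits.h>

   /* Hypothetical guard: the range check that must be discharged, by
      proof or by an explicit test, before a signed addition whose
      operands are not otherwise known to be in range -- with or
      without -fwrapv. */
   static int add_would_overflow(int a, int b)
   {
      if (b > 0 && a > INT_MAX - b) return 1;   /* a + b > INT_MAX */
      if (b < 0 && a < INT_MIN - b) return 1;   /* a + b < INT_MIN */
      return 0;
   }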

> If you want to prove that a program doesn't cause undefined behavior,
> it is very helpful for signed integer overflow to be defined, even if
> it's just implementation-defined. That would be a huge selling point
> for GCC.

I don't see this.

>    -Geert
