https://gcc.gnu.org/bugzilla/show_bug.cgi?id=111808

--- Comment #8 from Martin Uecker <muecker at gwdg dot de> ---

There are certainly other similar portability issues, e.g.:

enum : long { X = 0xFFFFFFFFUL };  /* fine where long is 64 bits, hard error where long is 32 bits */

https://godbolt.org/z/hKsqPe9c1

BTW: Are there better examples where we have similar build failures also in
pre-C2X code? (not counting explicit compile-time tests for sizes or limits)
Most simple C expressions do not seem to produce a hard error when switching
between 64- and 32-bit architectures; e.g. exceeding the range of int in an
enum initializer does not produce a hard error without -pedantic-errors before
C2X.  That we now seem to have such issues worries me a little bit.
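For illustration, something along these lines (assuming a 32-bit int; the
exact diagnostics depend on the GCC version and flags used):

enum { BIG = 0x90000000 };  /* exceeds INT_MAX with 32-bit int: before C2X
                               GCC accepts this as an extension, warns only
                               with -pedantic, and makes it a hard error
                               only with -pedantic-errors */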

In any case, I would argue that issues related to the size of integers are much
better understood by programmers, while excess precision is rather obscure and
also leaves many more implementation-defined degrees of freedom.  The behavior
of integers is more or less fixed by their width, but the precision with which
1. / 3. is computed on any specific platform is not restricted.  Using such an
expression in a constexpr initializer then makes the program inherently
non-portable, and I do not believe programmers are aware of this.
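A sketch of the kind of non-portability I mean (assuming i386 with the default
-mfpmath=387 and -fexcess-precision=standard, i.e. FLT_EVAL_METHOD == 2; the
exact diagnostic depends on target and flags):

/* On x86-64 (FLT_EVAL_METHOD == 0) the division is done in float and the
   result is exactly representable in float, so this constexpr initializer
   is accepted.  On i387 the division is carried out in long double, the
   value is then not exactly representable in float, and the constexpr
   constraint is violated, turning the same line into a hard error. */
constexpr float f = 1.f / 3.f;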

Debugging such issues after the fact, because a package fails to build on, for
example, 3 of 20 architectures in Debian, is generally a huge pain.  On the
other hand, maybe excess precision on i386 is obscure enough, and i386 will go
away anyway, so we should not worry?
