https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113837

--- Comment #5 from H.J. Lu <hjl.tools at gmail dot com> ---
(In reply to Jakub Jelinek from comment #1)
> Ugh no, please don't.
> This is a significant ABI change.
> First of all, zeroing even for signed _BitInt is very weird; sign
> extension is more natural for that case.  But once _BitInt has no
> unspecified bits, everything that computes a _BitInt will also need
> to compute the extra bits.  That is not the case in the current code.
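
A minimal sketch of the cost being described, written as plain C rather
than generated code; the function name add7_zeroing, the 0x7f mask, and
the assumption of an 8-bit container for _BitInt(7) are all illustrative:

    /* With unspecified padding bits, a _BitInt(7) addition can be a
       plain byte add.  If the ABI defined the padding bit (zeroing
       shown here), every operation that produces a _BitInt(7) would
       need an extra step to fix that bit up: */
    unsigned char add7_zeroing (unsigned char a, unsigned char b)
    {
        return (unsigned char) ((a + b) & 0x7f);   /* extra masking */
    }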

Can we compare the codegen for zeroing the unused bits against leaving
them undefined when storing a signed _BitInt?
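
A small test case one could use for that comparison, assuming a compiler
with C23 _BitInt support and a one-byte container for _BitInt(7) as on
the x86-64 psABI; the program is a sketch, not taken from the report:

    #include <stdio.h>
    #include <string.h>

    int main (void)
    {
        _BitInt(7) x = -1;            /* all 7 value bits set */
        unsigned char raw;
        memcpy (&raw, &x, 1);         /* inspect the stored byte */
        /* Current psABI: the padding bit is unspecified, so raw may
           read back as 0x7f or 0xff.  A zeroing ABI would guarantee
           0x7f; a sign-extending ABI would guarantee 0xff.  */
        printf ("stored byte: 0x%02x\n", (unsigned) raw);
        return 0;
    }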
