https://gcc.gnu.org/bugzilla/show_bug.cgi?id=121938
--- Comment #9 from Andrew Pinski <pinskia at gcc dot gnu.org> ---
(In reply to post+gcc from comment #8)
> > This works though:
>
> Makes sense, given that using hex notation also works (IIUC that is
> implicitly unsigned).

Well, it is invalid C otherwise ...

> But if the literal is ambiguous about its sign, then somehow
> 0xffffffff80000000 is a cutoff point.
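A minimal sketch of the typing rule in play here, per C11 6.4.4.1p5: an
unsuffixed decimal constant is only ever given a signed type (int, long,
long long), while an unsuffixed hexadecimal constant may also be given the
unsigned type at each width. So 0xffffffff80000000 is valid and unsigned,
but the same value spelled in decimal fits no type in its list and is not
strictly valid C. The TYPE_NAME macro below is a hypothetical helper for
illustration, not anything from the report:

/* Sketch (C11): how unsuffixed integer constants are typed. */
#include <stdio.h>

#define TYPE_NAME(x) _Generic((x),            \
    int: "int",                               \
    unsigned int: "unsigned int",             \
    long: "long",                             \
    unsigned long: "unsigned long",           \
    long long: "long long",                   \
    unsigned long long: "unsigned long long", \
    default: "other")

int main(void)
{
    /* 0xffffffff80000000 exceeds LLONG_MAX, so no signed type in the
     * hex list fits; the constant falls through to an unsigned type.
     * On an LP64 target this prints "unsigned long". */
    printf("%s\n", TYPE_NAME(0xffffffff80000000));

    /* The same value in decimal, 18446744071562067968, fits no type in
     * the decimal list (which has only signed types), so strictly it is
     * not valid C; GCC accepts it only as an extension, with a
     * "constant is so large that it is unsigned" style warning. */
    return 0;
}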
