https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113226
Patrick Palka <ppalka at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |ppalka at gcc dot gnu.org

--- Comment #1 from Patrick Palka <ppalka at gcc dot gnu.org> ---
Huh, how bizarre.

> i == 1, j == -100, i*j == 4294967196, max_type(i) == 1, max_type(i)*j == -100

Here i and j are just ordinary 'long long', so I don't get why i*j is
4294967196 instead of -100?
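
(Editorial note, not part of the original comment: 4294967196 is exactly
2^32 - 100, i.e. the bit pattern of -100 viewed as an unsigned 32-bit
value, which is consistent with the product being narrowed to a 32-bit
unsigned type somewhere before printing. A minimal sketch of that
relation, assuming nothing about the reporter's actual code:

    #include <cstdint>
    #include <cstdio>

    int main() {
        long long i = 1, j = -100;
        long long prod = i * j;  // ordinary long long multiplication: -100
        // Reinterpreting -100 as an unsigned 32-bit value yields 4294967196.
        std::uint32_t wrapped = static_cast<std::uint32_t>(prod);
        std::printf("i*j == %lld, as uint32_t == %u\n", prod, wrapped);
    }

Compiled with g++, this prints "i*j == -100, as uint32_t == 4294967196".)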