https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68835
Jakub Jelinek <jakub at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
                 CC|                            |rsandifo at gcc dot gnu.org
           Assignee|jakub at gcc dot gnu.org    |unassigned at gcc dot gnu.org

--- Comment #8 from Jakub Jelinek <jakub at gcc dot gnu.org> ---
The > HOST_BITS_PER_WIDE_INT stuff is just weird.

p debug_tree (build_nonstandard_integer_type (66, 1))
 <integer_type 0x7ffff18170a8 public unsigned TI
    size <integer_cst 0x7ffff1713cf0 type <integer_type 0x7ffff17172a0 bitsizetype> constant 128>
    unit size <integer_cst 0x7ffff1713d08 type <integer_type 0x7ffff17171f8 sizetype> constant 16>
    align 128 symtab 0 alias set -1 canonical type 0x7ffff18170a8 precision 66
    min <integer_cst 0x7ffff1827180 0> max <integer_cst 0x7ffff17302b8 0xffffffffffffffffffffffffffffffff>>
(gdb) p debug_tree (build_int_cst (0x7ffff18170a8, 1))
 <integer_cst 0x7ffff18271b0 type <integer_type 0x7ffff18170a8> constant 1>
(gdb) p debug_tree (int_const_binop (MINUS_EXPR, 0x7ffff17302b8, 0x7ffff18271b0))
 <integer_cst 0x7ffff1827198 type <integer_type 0x7ffff18170a8> constant 0xfffffffffffffffffffffffffffffffe>
(gdb) p debug_tree (build_int_cst (0x7ffff18170a8, 2))
 <integer_cst 0x7ffff18271e0 type <integer_type 0x7ffff18170a8> constant 2>
(gdb) p debug_tree (int_const_binop (TRUNC_DIV_EXPR, 0x7ffff17302b8, 0x7ffff18271e0))
 <integer_cst 0x7ffff1824ea0 type <integer_type 0x7ffff18170a8> constant 0x1ffffffffffffffff>

The max value of the unsigned type looks wrong (for signed types both the min
and max values look ok), and so does the max - 1 value, but the max / 2 value
is printed correctly.  This could be a pretty-printer bug, or a bug in
wide_int_to_tree.  It depends on what the canonical representation should be
for unsigned values of precision > HOST_BITS_PER_WIDE_INT (and
< 2 * HOST_BITS_PER_WIDE_INT) that lie between the maximum and
maximum - LLONG_MAX.