https://gcc.gnu.org/bugzilla/show_bug.cgi?id=78429
--- Comment #3 from Eric Botcazou <ebotcazou at gcc dot gnu.org> ---
It looks like this guard in set_and_canonicalize_value_range:

  /* For one bit precision if max < min, then the swapped
     range covers all values, so for VR_RANGE it is varying and
     for VR_ANTI_RANGE empty range, so drop to varying as well.  */
  if (TYPE_PRECISION (TREE_TYPE (min)) == 1)
    {
      set_value_range_to_varying (vr);
      return;
    }

is bypassed because we have a rather surprising boolean type:

(gdb) p debug_tree(min)
 <integer_cst 0x7ffff6d85c90 type <boolean_type 0x7ffff6d7b888> constant 0>
(gdb) p debug_tree((tree)0x7ffff6d7b888)
 <boolean_type 0x7ffff6d7b888 public SI
    size <integer_cst 0x7ffff6c34ee8 type <integer_type 0x7ffff6c39150 bitsizetype> constant 32>
    unit size <integer_cst 0x7ffff6c34f00 type <integer_type 0x7ffff6c390a8 sizetype> constant 4>
    align 32 symtab 0 alias set -1 canonical type 0x7ffff6d7b888 precision 32
    min <integer_cst 0x7ffff6d85a20 -2147483648> max <integer_cst 0x7ffff6d85bb8 2147483647>>