https://gcc.gnu.org/bugzilla/show_bug.cgi?id=122931

            Bug ID: 122931
           Summary: (uint64_v < _Maxof(uint32_t)) & ((uint32_t)uint64_v)
                    == 0 should simplify to uint64_v == 0
           Product: gcc
           Version: 16.0
            Status: UNCONFIRMED
          Keywords: missed-optimization
          Severity: normal
          Priority: P3
         Component: tree-optimization
          Assignee: unassigned at gcc dot gnu.org
          Reporter: pinskia at gcc dot gnu.org
  Target Milestone: ---

testcase:
```
int f(unsigned long long a)
{
  return (a < _Maxof(unsigned int))
       & (((unsigned int)a) == 0);
}

int f1(unsigned long long a)
{
  return (a < _Maxof(unsigned int))
       & ((a & (unsigned long long)_Maxof(unsigned int)) == 0);
}
```

Both of these should simplify to just `return a == 0`.

I am using the `_Maxof` extension because it is easier and does not depend on the exact sizes of `long long` and `int` here :).
