https://gcc.gnu.org/bugzilla/show_bug.cgi?id=104375

            Bug ID: 104375
           Summary: [x86] Failure to recognize bzhi pattern when shr is
                    present
           Product: gcc
           Version: 12.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: target
          Assignee: unassigned at gcc dot gnu.org
          Reporter: gabravier at gmail dot com
  Target Milestone: ---

#include <stdint.h>

/* Extract a bit field of width len starting at bit off of w
   (well-defined for len < 32, given the 1U-based mask). */
uint64_t bextr_u64(uint64_t w, unsigned off, unsigned int len)
{
        return (w >> off) & ((1U << len) - 1U);
}

With -mbmi2, this can be optimized to use a shrx followed by a bzhi. LLVM
performs this transformation, but GCC does not.
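
For reference, here is a hand-written sketch of the desired lowering using the
BMI2 intrinsic _bzhi_u32 (the helper name bextr_u64_bmi2 is only illustrative,
not part of the report): the variable shift corresponds to shrx and the masking
step to bzhi.

#include <stdint.h>
#include <immintrin.h>

/* Illustrative only: roughly the shrx + bzhi sequence described above,
   written with the _bzhi_u32 intrinsic. Build with -mbmi2. Matches the
   original function for len < 32, where its 1U-based mask is well-defined. */
uint64_t bextr_u64_bmi2(uint64_t w, unsigned off, unsigned int len)
{
        /* Variable right shift -> shrx. */
        uint64_t shifted = w >> off;
        /* Clear all bits at position len and above -> bzhi. The 32-bit form
           suffices because the original mask is an unsigned int. */
        return _bzhi_u32((unsigned int)shifted, len);
}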


PS: Even when the shr is removed so that the bzhi pattern is recognized
(e.g. `return w & ((1U << len) - 1U);`), the result is still not compiled
optimally: for some reason GCC places the result of the bzhi in an
intermediate register before moving it to eax.
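
For completeness, a standalone form of that reduced case (the helper name is
again only illustrative; the body is the expression quoted above):

#include <stdint.h>

/* Reduced case from the PS: with the shift gone, GCC does recognize the
   bzhi pattern, but (per the report) still routes the result through an
   extra register before the return register. Compile with -mbmi2. */
uint64_t mask_low_bits_u64(uint64_t w, unsigned int len)
{
        return w & ((1U << len) - 1U);
}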
