https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95070

            Bug ID: 95070
           Summary: vec_cntlz_lsbb implementation uses BE semantics on LE
           Product: gcc
           Version: 8.3.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: target
          Assignee: unassigned at gcc dot gnu.org
          Reporter: pc at us dot ibm.com
  Target Milestone: ---

Created attachment 48512
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=48512&action=edit
test case

This:
--
        vector unsigned char a = { 0xFF, 0xFF, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                                   0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
        int r = vec_cntlz_lsbb (a);
--
returns 14 on LE and 0 on BE. It should return 0 on both.

vec_cntlz_lsbb counts bytes whose least-significant bit is 0, *starting from the
lowest element number*.  In the above code, a[0] == 0xFF, whose least-significant
bit is 1, so the count should find 0 bytes.

The same issue occurs with vec_cnttz_lsbb, which counts from the highest element
number downward: it should find 14 bytes in the above example on both LE and BE,
but finds 0 and 14, respectively.
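For reference, the expected endian-independent behavior can be modeled in plain
C. This is only a sketch of the intended semantics (the helper names
cntlz_lsbb/cnttz_lsbb are invented here), not the GCC built-ins or the vclzlsbb/
vctzlsbb hardware instructions themselves:
--
#include <assert.h>
#include <stdio.h>

/* Count leading bytes (from element 0 upward) whose
   least-significant bit is 0. */
static int cntlz_lsbb(const unsigned char v[16]) {
    int n = 0;
    while (n < 16 && (v[n] & 1) == 0)
        n++;
    return n;
}

/* Count trailing bytes (from element 15 downward) whose
   least-significant bit is 0. */
static int cnttz_lsbb(const unsigned char v[16]) {
    int n = 0;
    while (n < 16 && (v[15 - n] & 1) == 0)
        n++;
    return n;
}

int main(void) {
    unsigned char a[16] = { 0xFF, 0xFF, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00,
                            0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 };
    /* a[0] has its LSB set, so no leading qualifying bytes... */
    assert(cntlz_lsbb(a) == 0);
    /* ...while the 14 trailing 0x00 bytes all have LSB 0. */
    assert(cnttz_lsbb(a) == 14);
    printf("cntlz_lsbb = %d, cnttz_lsbb = %d\n",
           cntlz_lsbb(a), cnttz_lsbb(a));
    return 0;
}
--
Under this model, the results are the same regardless of host endianness, which
is what the built-ins should produce.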
