On 2019/6/12 4:04 PM, Richard Biener wrote:
On Wed, Jun 12, 2019 at 5:22 AM Li Jia He <heli...@linux.ibm.com> wrote:

Hi,

I recently did some analysis of GCC's automatic vectorization, and I
found that signed char cannot be vectorized in the following code.

---
#define ITERATIONS 1000000

#if defined(do_reduce_signed_char)
#define TYPE signed char
#elif defined(do_reduce_unsigned_char)
#define TYPE unsigned char
#else
#error bad define
#endif

#define SIZE (16384/sizeof(TYPE))

static TYPE x[SIZE] __attribute__ ((aligned (16)));

void obfuscate(void *a, ...);

static void __attribute__((noinline)) do_one(void)
{
      unsigned long i;
      TYPE a = 0;

      obfuscate(x);

      for (i = 0; i < SIZE; i++)
          a += x[i];

      obfuscate(x, a);
}

int main(void)
{
      unsigned long i;

      for (i = 0; i < ITERATIONS; i++)
          do_one();

      return 0;
}
---
If we use the following command line

gcc reduce.c -Ddo_reduce_unsigned_char -Ofast -c -S -fdump-tree-vect-details

we can see that this code can be vectorized when the data type is
unsigned char.
If we use the following command

gcc reduce.c -Ddo_reduce_signed_char -Ofast -c -S -fdump-tree-vect-details

we can see that this code cannot be vectorized when the data type is
signed char.
I found that for signed char, the statement
---
a += x[i];
---
is converted into something like the following:
---
a = (signed char) ((unsigned char) x[i] + (unsigned char) a);
---
As a result, the reduction in the code cannot be effectively identified.
Can we vectorize code like the above when the data type is signed char?

Probably another case of https://gcc.gnu.org/PR65930

Richard.
Thanks to Kyrill and Richard, I have subscribed to this issue.

Thanks,
Lijia He
