This is happening in gcc.dg/tree-ssa/20040121-1.c.  The test
specifically checks that (p != 0) + (q != 0) is computed as
int:

char *foo(char *p, char *q) {
    int x = (p != 0) + (q != 0);
    ...
}
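
(The rest of foo's body isn't important here.  A minimal standalone
version of what the test expects might look like the following; this
is just my own sketch, not the actual testcase, and count_nonnull is
my own name.)

#include <assert.h>

static int count_nonnull(char *p, char *q)
{
    /* '!=' yields an int, so with both pointers non-null this
       adds 1 + 1 as ints and x should be 2.  */
    int x = (p != 0) + (q != 0);
    return x;
}

int main(void)
{
    char a, b;
    assert(count_nonnull(&a, &b) == 2);
    return 0;
}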

During VRP, we get this IL:

  D.1294_10 = first_8 != 0B;
  D.1295_11 = last_9 != 0B;
  x_12 = D.1294_10 + D.1295_11;

The ranges we have at this point are:

  first_8:    ~[0, 0]
  last_9:     ~[0, 0]
  D.1294_10:  [1, 1]       <-- _Bool 'true'
  D.1295_11:  [1, 1]       <-- _Bool 'true'

When we call int_const_binop to fold 'true' + 'true', it returns
'false' (presumably because the addition is done in the 1-bit _Bool
type, where 1 + 1 wraps to 0), so we end up with the range [0, 0]
for x_12, which causes the test to fail.
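
To make the arithmetic concrete, here is my own illustration of the
truncation (not GCC code):

#include <stdio.h>

int main(void)
{
    /* What the fold seems to compute: a sum done in the 1-bit
       _Bool type is truncated mod 2, so 1 + 1 becomes 0.  */
    unsigned folded = (1u + 1u) & 1u;    /* 0, i.e. _Bool 'false' */

    /* What C asks for: the operands are promoted to int before
       the addition, so the sum is 2.  */
    int promoted = 1 + 1;                /* 2 */

    printf("folded = %u, promoted = %d\n", folded, promoted);
    return 0;
}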

I don't know who's at fault here.  VRP is doing exactly what the
IL tells it to.  Is this program legal C?  Or should the FE emit
casts to int here?
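
For whatever it's worth, here is a quick standalone check of how the
_Bool addition behaves under the integer promotions (again just my
own sketch, not from the testcase):

#include <stdio.h>

int main(void)
{
    _Bool b1 = 1, b2 = 1;

    /* The integer promotions convert both _Bool operands to int,
       so the sum is 2 and its type is int, not _Bool.  */
    printf("b1 + b2 = %d, sizeof(b1 + b2) = %zu, sizeof(int) = %zu\n",
           b1 + b2, sizeof(b1 + b2), sizeof(int));
    return 0;
}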


Thanks.  Diego.
