[Bug tree-optimization/68294] gcc cannot deduce (a | b) != 0 from (a != 0 && b != 0)

2015-11-11 Thread fuz at fuz dot su
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68294

--- Comment #2 from Robert Clausecker  ---
Created attachment 36689
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=36689&action=edit
Testcase for bug #68294

[Bug tree-optimization/68294] gcc cannot deduce (a | b) != 0 from (a != 0 && b != 0)

2015-11-11 Thread fuz at fuz dot su
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68294

--- Comment #3 from Robert Clausecker  ---
Sorry, I forgot to attach the test case. Here it is.

[Bug tree-optimization/68294] gcc cannot deduce (a | b) != 0 from (a != 0 && b != 0)

2015-11-11 Thread glisse at gcc dot gnu.org
https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68294

--- Comment #1 from Marc Glisse  ---
Please always include a compilable testcase so everyone doesn't have to
reinvent one from your explanations.

GCC knows that u|v is not 0 (from VRP), but does not take advantage of that
information in this case.
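
(The attached testcase itself is not reproduced in this archive. As a rough
sketch only, based on the bug title, a reduced example of the missed
deduction might look like the following; the helper names keep() and
fold_away() are invented here for illustration and are not from the report.)

    extern void keep(void);
    extern void fold_away(void);

    void f(unsigned u, unsigned v)
    {
        if (u != 0 && v != 0) {
            /* In this branch VRP knows u != 0 and v != 0, and therefore
               u | v != 0, but that fact is not used to fold the test
               below.  */
            if ((u | v) != 0)
                keep();
            else
                fold_away();   /* unreachable; could be eliminated */
        }
    }

(If the deduction were made, the inner condition would fold to true and the
call to fold_away() would be removed as dead code.)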