[Bug tree-optimization/68294] gcc cannot deduce (a | b) != 0 from (a != 0 && b != 0)

https://gcc.gnu.org/bugzilla/show_bug.cgi?id=68294

--- Comment #1 from Marc Glisse ---
Please always include a compilable testcase so everyone doesn't have to reinvent one from your explanations. GCC knows that u|v is not 0 (from VRP), but does not take advantage of that information in this case.

--- Comment #2 from Robert Clausecker ---
Created attachment 36689
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=36689&action=edit
Testcase for bug #68294

--- Comment #3 from Robert Clausecker ---
Sorry, I forgot to attach the test case. Here it is.
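
For illustration, a minimal sketch of the kind of testcase under discussion (a hypothetical reconstruction; the actual contents of attachment 36689 are not reproduced here). On the final path both operands are known non-zero, so VRP can prove (a | b) != 0, yet the comparison is not folded to a constant:

/* Hypothetical reconstruction -- not the actual attachment 36689. */
int f(unsigned a, unsigned b)
{
        if (a == 0)
                return 0;
        if (b == 0)
                return 0;
        /* Here a != 0 and b != 0 are both known, so (a | b) != 0
           must be true, but gcc (at the time of this report) still
           emits the or and the compare instead of returning 1. */
        return (a | b) != 0;
}

The desired optimization is to fold the final return to a constant 1; the missed fold is what this report is about.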