https://gcc.gnu.org/bugzilla/show_bug.cgi?id=63224

--- Comment #4 from Manuel López-Ibáñez <manu at gcc dot gnu.org> ---
I'm sure that bug has already been filed and analyzed, but I cannot find it
right now.

The problem there (and probably here) is that a && b is converted to a & b.
b is uninitialized if and only if a == 0, so the short-circuited form never
actually reads the uninitialized value, but after the conversion to a & b it
is read unconditionally, and the uninit pass is not smart enough to realize
that the read is harmless.
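
For reference, a minimal sketch of the kind of pattern being described
(not the exact testcase from this PR; names are made up):

    int f (int x)
    {
      int a = (x > 0);
      int b;              /* only initialized when a is nonzero */
      if (a)
        b = (x > 10);
      /* Written as a && b, so b is never read when a == 0.  The compiler
         may fold this to a & b, which reads b unconditionally, and the
         uninit pass then warns even though the result is unaffected.  */
      return a && b;
    }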
