https://gcc.gnu.org/bugzilla/show_bug.cgi?id=113703

Richard Biener <rguenth at gcc dot gnu.org> changed:

           What    |Removed                     |Added
----------------------------------------------------------------------------
           Keywords|                            |wrong-code
                 CC|                            |rguenth at gcc dot gnu.org

--- Comment #1 from Richard Biener <rguenth at gcc dot gnu.org> ---
I think the point is that we fail to represent

Analyzing # of iterations of loop 1
  exit condition [i_5(D) + 1, + , 1] < n_11(D)
  bounds on difference of bases: -18446744073709551615 ... 18446744073709551615
  result:
    zero if i_5(D) + 1 > n_11(D)
    # of iterations (n_11(D) - i_5(D)) + 18446744073709551615, bounded by 18446744073709551615
  number of iterations (n_11(D) - i_5(D)) + 18446744073709551615; zero if i_5(D) + 1 > n_11(D)

specifically the 'zero if i_5(D) + 1 > n_11(D)'

I think may_eliminate_iv is wrong here; maybe it does not consider overflow
of the niter expression?
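For reference, a minimal sketch of the kind of loop this dump corresponds to: the IV starts at i_5(D) + 1 and steps by 1 until it reaches n_11(D). The function name and parameter names here are illustrative, not taken from the PR; the wraparound case shows why overflow of i + 1 matters for the zero-iterations condition.

```c
#include <limits.h>

/* Hypothetical standalone form of the analyzed loop: the IV begins at
   i + 1 and increments by 1 while it stays below n.  When i == ULONG_MAX,
   i + 1 wraps to 0 and the loop executes n times, even though in infinite
   precision i + 1 > n would suggest zero iterations.  */
unsigned long
count_iters (unsigned long i, unsigned long n)
{
  unsigned long iters = 0;
  for (unsigned long j = i + 1; j < n; j++)
    iters++;
  return iters;
}
```

E.g. count_iters (ULONG_MAX, 3) returns 3 because j wraps to 0, which is the kind of overflow a transformation consuming the niter expression has to account for.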

I wonder if it is possible to write a runtime testcase that FAILs with a
reasonable memory requirement/layout.
