https://gcc.gnu.org/bugzilla/show_bug.cgi?id=93105

            Bug ID: 93105
           Summary: Wrong optimization for pointers: provenance of `p +
                    (q1 - q2)` is treated as `q1` when the provenance of
                    `p` is unknown
           Product: gcc
           Version: 10.0
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: rtl-optimization
          Assignee: unassigned at gcc dot gnu.org
          Reporter: ch3root at openwall dot com
  Target Milestone: ---

gcc seems to wrongly infer the provenance of a pointer expression of the form
`p + (q1 - q2)` when the following conditions hold:
- the provenance of the pointer `p` cannot be tracked;
- the provenance of `q1` or `q2` is known;
- `q1 - q2` cannot be simplified to get rid of the pointers (a contrasting
  variant where this last condition does not hold is sketched after the
  transcript below).

----------------------------------------------------------------------
#include <stdio.h>

__attribute__((noipa)) // imagine it in a separate TU
static int *opaque(int *p) { return p; }

int main()
{
    static int x, y;

    /* opaque(&y) - &y is 0, so r == &x; yet gcc seems to treat r as
       pointing to y.  */
    int *r = opaque(&x) + (opaque(&y) - &y);

    x = 1;
    *r = 2;                 /* writes to x */
    printf("x = %d\n", x);  /* must print "x = 2" */
}
----------------------------------------------------------------------
$ gcc -std=c11 -pedantic -Wall -Wextra test.c && ./a.out
x = 2
$ gcc -std=c11 -pedantic -Wall -Wextra -O3 test.c && ./a.out
x = 1
----------------------------------------------------------------------
gcc x86-64 version: gcc (GCC) 10.0.0 20191230 (experimental)
----------------------------------------------------------------------
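
For illustration only (not part of the original report), here is a hypothetical
contrasting variant in which the subtraction involves no opaque call, so
`&y - &y` folds to 0 at compile time and the last condition above no longer
holds; one would then expect this version to print x = 2 even at -O3 (not
verified here):
----------------------------------------------------------------------
#include <stdio.h>

__attribute__((noipa)) // imagine it in a separate TU
static int *opaque(int *p) { return p; }

int main()
{
    static int x, y;

    /* No pointer survives in the offset: `&y - &y` is folded to 0
       before provenance tracking matters, so r == &x.  */
    int *r = opaque(&x) + (&y - &y);

    x = 1;
    *r = 2;
    printf("x = %d\n", x);
}
----------------------------------------------------------------------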

The problem is similar to pr49330. Analysis by Alexander Monakov in bug 49330,
comment 29:

"It's a separate issue, and it's also a regression, gcc-4.7 did not miscompile
this. The responsible pass seems to be RTL DSE."
