https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84562

            Bug ID: 84562
           Summary: -faggressive-loop-optimizations makes decisions based
                    on weak data structures
           Product: gcc
           Version: 8.0.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: c
          Assignee: unassigned at gcc dot gnu.org
          Reporter: jnordholz at sect dot tu-berlin.de
  Target Milestone: ---

Created attachment 43501
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=43501&action=edit
minimal example for both cases (int and array-of-char*)

(Minimal test case attached as source tar.gz, as the problem is inherently
multi-compilation-unit and doesn't involve any headers. Snippets below for
illustration.)

Compiling the following code with -c -O2...

__attribute__((weak)) const int y = 0;

void foo(void) {
    for (int i = 0; i < y; i++) {
        write(2, "X", 1);
    }
}

yields an empty foo() even though 'y' is weak. I consider this a bug: linking
with another unit that provides a strong definition of 'y' with a nonzero value
will Do The Wrong Thing without so much as a warning - and
-fsanitize=undefined cannot help either, as the whole loop has already been
optimized away.

In a similar case, having

__attribute__((weak)) const char *arr[] = { NULL };
[...]
    for (int i = 0; arr[i]; i++) {

in one unit and overriding 'arr' in another with

const char *arr[] = { "abc", "def", NULL };

causes related problems: this time the loop doesn't disappear completely, so
-fsanitize=undefined complains about illegal array accesses. I don't believe
that complaint is justified, unless the differing implicit array sizes are
illegal under the C standard; I couldn't find anything to that effect.

Tested with Debian gcc7 (7.3.0) and gcc8 (8.0.1) on x86_64, thus assigning to
the latest version.
