https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84562
--- Comment #4 from rguenther at suse dot de <rguenther at suse dot de> ---
On Tue, 27 Feb 2018, jnordholz at sect dot tu-berlin.de wrote:

> https://gcc.gnu.org/bugzilla/show_bug.cgi?id=84562
>
> Jan Nordholz <jnordholz at sect dot tu-berlin.de> changed:
>
>            What    |Removed                     |Added
> ----------------------------------------------------------------------------
>          Status    |RESOLVED                    |UNCONFIRMED
>      Resolution    |INVALID                     |---
>
> --- Comment #2 from Jan Nordholz <jnordholz at sect dot tu-berlin.de> ---
> Hi,
>
> sorry for reopening, but I don't think the comment properly addresses the
> bug report.
>
> a) This is not about C++ - the example is pure C, and weak definitions are
> an established mechanism.

I understand that this was about C.

> b) I don't see how the overriding of a weak 'const int y' with a strong
> 'const int y' might count as an "incompatible definition". The
> implicit-sized arrays might be a different story, true, but I can't see
> how you've refuted my first example.
>
> I understand that this is probably a minor issue, as weak objects are
> probably only used by a minority of developers. Still, gcc silently
> generates buggy code which could only be prevented by either
> 1. moving the weak definition into a different compilation unit than (all)
> the code that uses it or
> 2. by compiling at less than -O2.
>
> If you consider this too low-prio, I'd gladly try to whip up a patch
> myself if I find the time.

Can you split this issue into two then? The first example is really
different from the others.