On 19/02/2021 09:45, Florian Weimer wrote:
> * David Brown:
>
>> On 18/02/2021 13:31, Florian Weimer via Gcc wrote:
>>> * Jonathan Wakely via Gcc:
>>>> Declare your functions. Don't ignore warnings.
>>>
>>> It's actually a GCC bug that this isn't an error. However, too many
>>> configure scripts would still break if we changed the default.
>> People have had 22 years to fix them. Implicit function declarations
>> were a terrible idea from day 1, and were banned outright in C99. It
>> was reasonable for them to be accepted when gcc's default C standard
>> was "gnu90" - they should never have been acceptable in any later
>> standard without an explicit flag. Since gcc 5 they have given a
>> warning by default - surely it is time for them to become a hard error?
>>
>> Just to be clear - I am not in any way suggesting that this situation
>> is the fault of any gcc developers. If configure scripts are failing
>> because they rely on poor C code or inappropriate use of gcc (code that
>> requires a particular C standard should specify it - gcc has the
>> "-std=" flags for that purpose), then the maintainers of those scripts
>> should fix them. If Fedora won't build just because the C compiler
>> insists that C code be written in C, then the Fedora folk need to fix
>> their build system.
>>
>> I appreciate that consistency and compatibility with existing, old and
>> unmaintained code bases, configure scripts and build systems is
>> important. But the cost is that people continue to make the same
>> mistakes they made before: they continue to write buggy code, and they
>> continue to cause crashes, security holes, and other trouble. At least
>> some problems could be stopped entirely by checks in tools like gcc.
>> There is never going to be a catch-all "-Wbug-in-the-program" warning,
>> but I really don't think it is unreasonable for a compiler to give an
>> error on code that is considered so bad it is no longer supported by
>> the language.
>>
>> The big problem I see is that as long as tools turn a blind eye (or at
>> least a tolerant eye) to code faults in order to retain compatibility
>> with older and poorer code bases, they let the same mistakes through
>> in /new/ code.
> Have you actually tried to make the change and seen what happens?
No - my comments are entirely wishful thinking. I realise decisions of
this kind have to be made by people who /have/ tried it and seen the
effects on a wide range of software - not by someone like me, who uses
gcc primarily for his own code.
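For one's own code, at least, there is no need to wait for the default to change: gcc already lets a project opt in to the stricter behaviour. A sketch, assuming a reasonably recent gcc on the PATH (the file name is arbitrary):

```shell
# A file that calls a function it never declares.
cat > demo.c <<'EOF'
int main(void) { return undeclared_fn(); }
EOF

# Accepted silently under the old default dialect:
gcc -std=gnu90 -fsyntax-only demo.c

# Promote just this diagnostic to a hard error, in any -std mode:
gcc -std=gnu90 -fsyntax-only -Werror=implicit-function-declaration demo.c
echo "exit status: $?"   # non-zero: the bad code is rejected
```

The same -Werror= flag works from a makefile or configure-supplied CFLAGS, so a single project can get the hard error today without anyone changing the compiler's defaults.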
> I fixed one bug in GCC less than two years ago because, apparently, I
> was the first person trying to change the GCC default for real. This
> was actually my second attempt, this time using Jeff's testing
> infrastructure. The first attempt totally broke Fedora, so we gave up
> immediately and never even got as far as encountering the GCC bug. The
> second attempt got a little further, fixing bash, gawk, gettext,
> gnulib, and make. Maybe that's all of GNU that needed fixing, but that
> seems unlikely (I didn't get through the full list of failing
> components). There were also many failures from other sources. Some
> looked rather hard to fix, for example unzip
> <https://bugzilla.redhat.com/show_bug.cgi?id=1750694>. In many cases
> key system components were affected where the upstream status is a bit
> dubious, so there is no good place for distributions to coordinate
> their fixes and share the effort.
>
> This is just another thing that is unfixable in (GNU) C. Personally, I
> have stopped caring, as long as the problem is not present in C++.
I do understand that. I am not really expecting gcc to change its
defaults here - the practicalities involve too much work. But that does
not stop me /wanting/ the change, or believing the software world would
be better for it.
David