On 11/05/2023 04:09, Po Lu via Gcc wrote:
> jwakely....@gmail.com (Jonathan Wakely) writes:

>> So let's do it. Let's write a statement saying that the GCC developers
>> consider software security to be of increasing importance, and that we
>> consider it irresponsible to default to accepting invalid constructs in the
>> name of backwards compatibility. State that we will make some changes which
>> are a break from GCC's traditional stance, for the good of the ecosystem.

> I'm sorry you think that way.

>> Given recent pushes to discourage or outright ban the use of memory-unsafe
>> languages in some domains, I think it would be good to make a strong
>> statement about taking the topic seriously. And not just make a statement,
>> but take action too.
>>
>> If we don't do this, I believe it will harm GCC in the long run. The vocal
>> minority who want to preserve the C they're used to, like some kind of
>> historical reenactment society, would get their wish: it would become a
>> historical dead end and go nowhere.

> Vocal minority? Do you have any evidence to back this claim?
>
> What I see is that some reasonable organizations have already chosen
> other C compilers which are capable of supporting their existing large
> bodies of C code that have seen significant investment over many years,
> while others have chosen to revise their C code with each major change
> to the language.
>
> The organizations which did not wish to change their code did not
> vocally demand changes to GCC after GCC became unsuitable, but quietly
> arranged to license other compilers.
>
> Those that continue to write traditional C code know what they are doing,
> and the limitations of traditional C do not affect the quality of their
> code.  For example, on the Unix systems at my organization, the SGS is
> modified so that it will not link a function called through a declaration
> with no parameter specification against a definition with a different
> set of parameters.
>
> Naturally, the modified linker is not used to run configure scripts.


Let's be absolutely clear here - gcc has been, and will continue to be, able to compile code according to both old and new standards. It can handle everything from K&R C right through to the cutting edge of the newest C and C++ standards. It can handle semantic requirements such as two's complement wrapping and "anything goes" pointer type conversions - features that a lot of old code relies on, but which are not documented or guaranteed behaviour in the vast majority of other compilers. It can handle all these things - /if/ you pick the correct flags.

With the proposed changes, you can still compile old K&R code with gcc - if you give it the right flags. No features are being removed - only the default flags are being changed. If anyone is changing from gcc to other compilers because they think newer gcc does not support older code, then they are perhaps doing so out of ignorance.
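
To make that concrete - under the proposed defaults, a fragment like the sketch below would be rejected out of the box, but it keeps building once you ask for an older standard explicitly. This is only an illustration (the file name, function names and exact flag set are made up for the example; what a real code base needs will vary):

    /* old.c - pre-standard style that the new defaults would reject */
    extern int printf();            /* declaration with no parameter list */
    main()                          /* implicit int return type           */
    {
        printf("%d\n", square(3));  /* call before any declaration        */
        return 0;
    }
    square(x)                       /* K&R-style definition, implicit int */
    int x;
    {
        return x * x;
    }

Built with something like "gcc -std=gnu90 old.c" (adding -fwrapv and/or -fno-strict-aliasing if the code also relies on wrapping signed arithmetic or loose pointer aliasing), it compiles as it always has; only the defaults change.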

If some users are willing to change to different compilers, but unwilling to learn or use new flags in order to continue using their existing compiler after it changes its defaults, then perhaps gcc could pick different defaults depending on the name used for the executable? If it is invoked with the name "gcc-kr", then it could accept K&R code by default and have "-std=gnu90" (I believe that's the oldest standard option). If it is invoked as "gcc", then it would reject missing function declarations, implicit int, etc., as hard errors.

Then these users could continue to use gcc, and their "new" compiler to handle their old code would be nothing more than a symbolic link.
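
To be clear, gcc does not currently look at the name it was invoked under when choosing defaults, so this is only a suggestion; but until something like it exists, much the same effect can be had today with a two-line wrapper instead of a bare symbolic link. A rough sketch (the name "gcc-kr" and the flag choice are just placeholders):

    #!/bin/sh
    # gcc-kr: wrap gcc with permissive, pre-standard-friendly defaults
    exec gcc -std=gnu90 "$@"

Drop that somewhere in PATH, point old build scripts at it, and nothing else about the installation needs to change.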

David
