Neil,

I'm not sure I understand what you mean by the following:

  A program that does not satisfy this constraint is erroneous, and many compilers take advantage of this constraint to optimize code more effectively.

Just because a program contains undefined behavior does not mean that it is 
erroneous.  It simply gives the compiler latitude in how to handle the 
undefined behavior while still conforming.

One possibility would be for GCC to handle these constructs in a consistent manner. GCC clearly implements modwrap semantics, and given that, I think the behavior exhibited in this case is inconsistent. If, on the other hand, GCC implemented saturation semantics, it would make perfect sense to optimize out this check.
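
For reference, the kind of check at issue looks roughly like the following.  This is only an illustrative sketch, not code taken from the vul note; the function name is made up, and an unsigned length is used so that the comparison can only succeed through pointer wrap-around:

  #include <stddef.h>

  /* Illustrative wrap-around guard of the kind discussed in the vul
   * note.  With an unsigned len, buf + len can compare less than buf
   * only if the addition wraps, and pointer arithmetic that leaves the
   * object is undefined, so an optimizing compiler may treat the test
   * as always false and drop it. */
  void fill_buffer(char *buf, size_t len)
  {
      if (buf + len < buf)       /* intended overflow check */
          return;
      /* ... write len bytes starting at buf ... */
  }

Under modwrap semantics the test would catch the wrap-around; under the standard's rules the wrap-around never occurs in a correct program, which is what licenses removing the test.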
I'll review your comments below and consider additional changes to the text of 
the vul note.  What we would really like to see is a diagnostic for this 
condition, ideally enabled by -Wall.  Once a version of the compiler that 
provides such a diagnostic is available, I will recommend that we change the 
US-CERT Addendum to recommend that developers upgrade to that version of the 
compiler and compile using -Wall, or whatever the appropriate flag turns out to be.

I am getting tired of the personal/organizational attacks.  If you expect a 
response, please keep your comments professional.

rCs



David Miller wrote:-

From: Joe Buck <[EMAIL PROTECTED]>
Date: Wed, 23 Apr 2008 08:24:44 -0700

If CERT is to maintain its reputation, it needs to do better.  The warning
is misdirected in any case; given the very large number of compilers that
these coding practices cause trouble for, you need to focus on the bad
coding practices, not on unfair demonization of new GCC releases.
In my opinion, CERT's advisory has been nothing but an unfair FUD
campaign against compilers, and GCC specifically, and has seriously
devalued CERT's advisories in general, which were already of low
value to begin with.

It looks similar to a news article run by a newspaper that is losing
money and has no real news to write about, yet still has to write
about something.

The worst part of this fiasco is that GCC's reputation has been
unfairly harmed in one way or another, and there is nothing CERT can
do to rectify the damage they've caused.

I'm appalled that the original description hasn't been corrected.  The
text reads:
        
  Some C compilers optimize away pointer arithmetic overflow tests that
  depend on undefined behavior without providing a diagnostic (a warning).
  Applications containing these tests may be vulnerable to buffer
  overflows if compiled with these compilers.

  I. Description
  In the C language, given the types:

        char *buf;
        int len;

some C compilers will assume that buf+len >= buf.

which is an entirely bogus description of the problem.  That this
incorrect description of the state of affairs has been left to
stand only shows that CERT, and those responsible for this advisory,
have completely failed to understand what the real issue is here.
Further, the fact that the "advisory" stands in this erroneous
form, despite it having been pointed out to them many times over
the past weeks on this forum at least, seriously undermines their
credibility in the eyes of any informed observer.

At a minimum the wording should be something more like:

  In the C language, given an object OBJ and a pointer BUF into OBJ,

        char *buf;
        int len;

  the C standard requires that the result of

        buf + len

  must point within, or one byte beyond, the object OBJ.  A program
  that does not satisfy this constraint is erroneous, and many
  compilers take advantage of this constraint to optimize code more
  effectively.  Unfortunately, much existing code is not well written
  and is sometimes erroneous in this regard, and hence may not behave
  as originally intended when compiled with optimizations enabled.
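
To make the contrast concrete, a guard written without relying on pointer overflow compares the requested length against the space actually remaining in the object.  The following is only a sketch with illustrative names, not part of the proposed wording above:

  #include <stddef.h>
  #include <string.h>

  /* Sketch of a conforming guard; assumes used <= buf_size.  Instead of
   * forming buf + len and testing for wrap-around, compare the request
   * against the space remaining in the object.  No pointer outside the
   * object is ever created, so the test involves no undefined behavior
   * and cannot legitimately be optimized away. */
  void append_bytes(char *buf, size_t buf_size, size_t used,
                    const char *src, size_t len)
  {
      if (len > buf_size - used)      /* does the request fit? */
          return;                     /* reject oversized requests */
      memcpy(buf + used, src, len);   /* stays within the object */
  }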

Neil.
