RE: US-CERT Vulnerability Note VU#162289

2008-04-23 Thread Gerald.Williams
Dave Korn wrote:
[ ... lots of exciting commentary on scientific method/etc.
  that I leave out for the protection of the innocent ... ]

Huzzah! Way to stick it to the man! :-) :-)

   This VU falls massively far below the standards we have come to
 expect from CERT, and should be withdrawn and reworked from scratch.

Good idea, although they already did rework it, and I doubt
they're going to withdraw it when it really is a potential
vulnerability that was apparently detected in the wild.

Looking through the new version, it doesn't seem all that
bad to me. The only problem is the GCC note, which has an
untempered recommendation to consider old versions. That
warning is still misguided, but you're not going to get
very far trying to say it is entirely wrong. There *may
be* someone who could be negatively affected by moving
to a new version, and RCS has implied that they can name
a case where this is true. Maybe we can convince them to
temper the warning, I guess. [I mean really, changing the
compiler in any way could trigger vulnerabilities if you
have no idea what you're shoving into it. If you cannot
depend at all on the quality of your code, test it well
and never recompile it. But that path can easily devolve
into a religious debate.]

Meanwhile, there is an opportunity for a vendor response
that will be added verbatim. Is anyone working on one for
GCC? I think that would go a long way.

gsw


RE: US-CERT Vulnerability Note VU#162289

2008-04-14 Thread Gerald.Williams
Robert Dewar wrote:
 An optimization is dubious to me if
 
 a) it produces surprising changes in behavior (note the importance of
 the word surprising here)
 
 b) it does not provide significant performance gains (note the
 importance of the word significant here).
 
 I find this optimization qualifies as meeting both criteria a) and b),
 so that's why I consider it dubious.

I don't think this is a particularly fruitful argument to be
having at this stage.

It's already been acknowledged that the source code is wrong
to assume that the compiler knows about wrapping of pointers.
The real issue at this stage is how to warn users who may be
using GCC and implicitly relying on its old behavior, without
unintentionally pushing people in the wrong direction. Since
this optimization is performed by many other compilers, the
ship has already sailed on this one, so to speak. [In fact,
after GCC does something to warn users about this, it'll be
much safer than those other compilers.]
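
For concreteness, here is a minimal sketch of my own (the function
names and the buf/buf_end parameters are illustrative, not from the
advisory). The first test relies on pointer overflow, which a
compiler may assume never happens; the second compares lengths and
never forms an out-of-range pointer:

 #include <stddef.h>

 /* Undefined if buf + len overflows; a compiler may legitimately
    fold this test away. */
 int has_room_bad(char *buf, size_t len)
 {
     return !(buf + len < buf);
 }

 /* Conforming alternative: compare lengths instead of forming a
    possibly out-of-range pointer.  Assumes buf and buf_end point
    into the same object, with buf <= buf_end. */
 int has_room_good(char *buf, size_t len, char *buf_end)
 {
     return len <= (size_t)(buf_end - buf);
 }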

I agree that on the face of it, it seems like you wouldn't
want to optimize away tests like this when you can know that
pointer arithmetic really does look the same as unsigned
arithmetic (for a particular architecture, etc.). However,
sometimes an optimization may enable thirty more, so I for
one am not going to argue against it. Especially not when
many other compilers do it also.

-Jerry

P.S. I'm having some déjà vu, recalling discussions back in
the GCC 2.7 days about whether it was really OK to change
the behavior for signed arithmetic to support devices with
saturation. We've come a long way since then.


RE: US-CERT Vulnerability Note VU#162289

2008-04-11 Thread Gerald.Williams
Robert C. Seacord wrote:
 Here is another version of the program (same compiler version/flags).
[...]
 void test_signed(char *buf) {
 signed int len;
[...]
 if((buf+len < buf) != ((uintptr_t)buf+len < (uintptr_t)buf))
   printf("BUG!");
[...]
 void test_unsigned(char *buf) {
 unsigned int len;
[...]
 if((buf+len < buf) != ((uintptr_t)buf+len < (uintptr_t)buf))
   printf("BUG!");
[...]
 The unsigned test was one we performed on the gcc versions.  I added
 the signed test, but it didn't make a difference on Visual Studio.
 My understanding is that it shouldn't, because the real issue here is
 pointer arithmetic and the resulting type should always be a pointer.

I'm not sure what you mean by that last statement.

I think we've already established that those tests that would
print "BUG!" aren't actually finding bugs in the compiler. It
is not correct to assume that adding a value to a char pointer
will wrap around in the same way as if you added a value to an
unsigned number. You also cannot assume that adding a value to
a signed number will cause it to wrap. (GCC had optimized away
checks for the latter already. There are now reports that many
other compilers may optimize away both tests.)
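
To spell out the distinction with a sketch of my own (not taken
from your test program): only the unsigned case below has defined
wraparound in C, so only that test must survive optimization.

 #include <limits.h>
 #include <stdio.h>

 int main(void)
 {
     unsigned int u = UINT_MAX;
     if (u + 1 < u)          /* defined: unsigned arithmetic wraps */
         puts("unsigned wrapped");

     /* The analogous tests are undefined on overflow, so a
        conforming compiler may fold them to a constant:
          int i = INT_MAX;  if (i + 1 < i)    -- undefined
          char *p;          if (p + len < p)  -- undefined  */
     return 0;
 }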

Are we in agreement on this? The fact that your example prints
"BUG!" seems to imply that it is invalid to optimize away these
tests, which it isn't.

I was under the impression that the only issue here is that
there is a change in behavior. That's a very fine line to walk
when many other compilers do this. In fact, you can run into
the same change of behavior when switching from unoptimized
debug versions to optimized release versions. Recommending not
using a recent GCC as a possible remedy is dangerous (some
would probably say irresponsible). What you really mean is,
"Use an older GCC or some other compiler that is known not to
take advantage of this optimization."

-Jerry

P.S. Has anyone checked how far back in the line of Microsoft
compilers you have to go before they stop performing this same
optimization (i.e., can we show irrefutably that this warning
should apply in the same degree to their compilers as well,
even without the debug versus release version argument)?
I suppose you only have to go as far back as the free version,
by inferring from the debug vs. release argument. :-)


RE: US-CERT Vulnerability Note VU#162289

2008-04-11 Thread Gerald.Williams
Robert C. Seacord wrote:
 this was only one of several solutions listed, and not the first one
 listed.

Yes, CERT did the right thing by recommending first that the
code be changed (kudos for that).

  What you really mean is, "Use an older GCC or some other compiler
  that is known not to take advantage of this optimization."
 i think we mean what we say, which is *Avoid newer versions of gcc*
 and *avoiding the use of gcc versions 4.2 and later*.  i don't see any
 verbiage that says use a different compiler.

I hope you can understand why that particular phrasing would be
viewed with some scorn, at least on the GCC list. Presumably,
the intent really is to suggest using a compiler that doesn't
have that optimization, not "don't use recent GCC versions."

 our tests show that the 2005 version of the compiler does not perform
 this optimization.  i have not yet tested a newer version of the
 compiler.

There was a report (forwarded by Mark Mitchell) of Microsoft
Visual C++ 2005 performing that optimization (the resultant
object code was shown). Have you verified that this report
was false? If not, it may be that you were using a different
set of options or a different version of that compiler.

-Jerry


RE: US-CERT Vulnerability Note VU#162289

2008-04-09 Thread Gerald.Williams
Robert C. Seacord wrote:
 void f(char *buf) {
   unsigned int len = 0xFF00;

   if (buf+len < buf) puts("true");
 }

You need to be more precise. That is not the same example
that you quoted for GCC.

In fact, if you vary the criteria too much, you will find
situations where GCC already behaved that way. The test in
the following example is optimized out by old versions of
GCC (certainly my version 3.4.5 compiler does it, with no
warnings even when using -Wall):

 int f(char *buf, int i)
 {
   i = 130;

   if ((int)buf + i < (int)buf)
     return 0;

   return 1;
 }

That example stays much closer to the original code than
yours does, since yours brings unsignedness into the
picture. [This is exactly the problem--signed overflow
and pointer overflow aren't defined, unlike unsigned
overflow.]
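
To illustrate that contrast (again a sketch of my own, not from the
advisory): the same shape of test written on an unsigned variable
has defined wraparound, so a conforming compiler must keep it:

 int g(unsigned int x)
 {
   unsigned int i = 130;

   if (x + i < x)     /* defined: true exactly when x + i wraps */
     return 0;

   return 1;
 }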

Given that current Microsoft compilers reportedly exhibit
this behavior, it sounds like the advisory is going to at
least need some significant rewriting. :-)

-Jerry


RE: Progress on GCC plugins ?

2007-11-16 Thread Gerald.Williams
Joe Buck wrote:
 RMS believes that people who extend GCC, hoping to take their
 extensions proprietary, and then finding that they can't, will then
 just decide to contribute the code, if it is useful, since otherwise
 they can't distribute and have to support it by themselves forever,
 or else they have to risk legal problems.  And he has some evidence
 that this sometimes happens (C++, Objective-C, many contributed back
 ends).  So the intent isn't to prevent certain people from using it,
 but to have those people contribute the changes back even if that
 isn't their preference.

 Now that's fine as far as it goes, but when it becomes a defense
 of an opaque, non-extendable architecture we have a problem.

Agreed. It can also make it harder to contribute changes back, thus
possibly precluding some contributions.

-Jerry


RE: Progress on GCC plugins ?

2007-11-16 Thread Gerald.Williams
Much as I hate prolonging a probably-pointless discussion...

I hope we aren't thinking about keeping things difficult for
everybody simply because "everybody" includes some people who
may want to take advantage of GCC in a proprietary way. In
the long term, this only benefits the folks you'd be trying
to exclude.

Think about it. You have nothing to fear from people writing
trivial add-ons (if they're useful, they'll be duplicated in
open source; if not, they'll fade away). The only people you
might need to worry about would be those writing significant
new compiler designs/enhancements using GCC as a starting
point (and possibly trying to get past the GPL by avoiding
direct linking/etc.). Yet as has already been pointed out,
they can do this by creating their own interface to
plug into. If they use a standard interface (since the code
base would favor this), it would be easier to replace their
plug-ins with open-source alternatives later.

-Jerry


RE: Progress on GCC plugins ?

2007-11-09 Thread Gerald.Williams
Mark Mitchell wrote:
 Anyhow, in practical terms, debating this here probably will have zero
 impact on the outcome.  The ball is in RMS' court, and SC members
 (including myself) have made many of the arguments that have been made
 in this thread.  If people want to influence the FSF, the best
 approach is probably going to be sending mail directly to RMS, not
 discussing it on this list.

I think your opinion probably carries a bit more weight
with RMS than mine. :-)

I don't want to prolong a pointless discussion either,
but I hope somebody has made the "forest through the
trees" argument. Proprietary extensions using a standard
plug-in interface will eventually get replaced by free
ones if they're of value. In the long term, this type of
direct assault on the GPL would backfire in a big way.

-Jerry