On 11/04/14 13:49, Ansgar Burchardt wrote:
> Hi,
>
> On 04/11/2014 12:42, Ian Jackson wrote:
>>
>> What people expect is that the compiler compiles programs the way C
>> was traditionally compiled.
> Shouldn't -O0 come close to that expectation?
I think that Ansgar's answer is spot on, but against all good sense, I
still want to expand on it.

Neither the compiler nor its authors are doing anything out of spite. It
is, indeed, painful when a compiler optimizes away a security check
because the standard defines some construct as "undefined behavior".
However, for any such case there are hundreds in which the same
optimization eliminates an "if" that would strain the branch predictor,
or coalesces operations that would otherwise have to be done one after
the other, or does any number of other things that leave the output
machine language looking nothing like the high-level C or C++ you wrote.
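
To give a contrived example of my own (not one from Ian's mail): because
signed overflow is undefined behavior, the optimizer is allowed to treat
a check like the one below as always true and drop the branch entirely.

// Hypothetical sketch: the compiler may assume signed overflow never
// happens, so it is allowed to fold "x + 1 > x" to true and remove the
// fallback path altogether.
int add_one_checked(int x)
{
    if (x + 1 > x)   // always "true" as far as the optimizer is concerned
        return x + 1;
    return x;        // the optimizer may delete this path entirely
}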

Not only is this good for performance, it is also good for security. For
example, in C++ I can run the following code:

for( unsigned int i=0; i<size; ++i )
    vector1.at(i) = vector2.at(i);

"at" is better than square brackets, because it does bounds checking. I
think you'll agree with me that bounds checking is good for security.
Running this code as written, however, results in far more bounds checks
than necessary. Luckily, the same kind of compiler optimization that
angered you will now realize that the bounds only ever need to be
tested once. The result
is machine code that looks nothing like my C++ source, but which does
things both quickly and securely.
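
Roughly speaking, and this is only a sketch of the transformation rather
than any compiler's actual output, the optimizer is free to turn the
loop above into something like:

// Sketch only: a real compiler must preserve the exact point at which
// an exception would be thrown, so it keeps a checked fallback path;
// this is the fast path it aims for.
if (size <= vector1.size() && size <= vector2.size()) {
    for (unsigned int i = 0; i < size; ++i)
        vector1[i] = vector2[i];        // no per-element checks needed here
} else {
    for (unsigned int i = 0; i < size; ++i)
        vector1.at(i) = vector2.at(i);  // slow, fully checked path
}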

The quickly part is important. If I actually had to run the bounds check
each and every time, this code would likely be too slow to be practical.
I would then have no choice but to use the version that does no bounds
checking at all. I'd like to hear how that would make my code more
secure.

Alternatively, I might re-write the loop. This loop is relatively easy
to write with explicit bounds checking, but explicit bounds checking has
two major disadvantages:
1. It is easy to forget to do it correctly (see the Heartbleed bug).
2. It makes the code less readable and less maintainable.

Both of those problems, again, translate to less secure code.
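
To make that concrete, here is a sketch of my own of what the
hand-checked version would look like; nothing warns me if I get the
comparison wrong or forget the check altogether:

// Hand-rolled bounds checking, hypothetical sketch. An off-by-one in
// the comparison, or simply forgetting the check, compiles just fine.
if (size > vector1.size() || size > vector2.size())
    return;  // or report an error, which is easy to forget entirely
for (unsigned int i = 0; i < size; ++i)
    vector1[i] = vector2[i];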

I, for one, accept the extra liability that modern optimizers introduce
in exchange for the easier-to-maintain, more secure code they allow me
to write.

Shachar
