John Regehr wrote:
Optimizations based on uninitialized variables make me very nervous.
If uninitialized memory reads are transformed into don't-cares, then
checking tools like valgrind will no longer see the UMR (assuming that
the lack of initialization is a bug).
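
For concreteness, here is a minimal sketch of the kind of UMR memcheck
reports (this program is illustrative, not from the thread): memcheck
flags a conditional jump that depends on an uninitialised value, so it
can only report the bug if the compiled code actually performs the
read. A compiler is of course just as free to drop this test as the
add below; the point is only what memcheck looks for.

#include <stdio.h>
#include <stdlib.h>

int main (void)
{
   int *p = malloc (sizeof *p);  /* *p is never initialized */
   if (p && *p > 0)              /* memcheck: "Conditional jump or move
                                    depends on uninitialised value(s)" */
      puts ("positive");
   free (p);
   return 0;
}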

Well, that's the way things are: you cannot count on any specific
behavior of the compiler if there are references to uninitialized
variables, and if valgrind depends on such specific behavior, it
is wrong to do so. Of course, in practice valgrind HAS to rely
on this behavior, but realistically it cannot be expected to be
reliable if optimization is turned on.

Did I understand that icc does this?  It seems like a dangerous practice.

On the contrary, to me, trying to define the undefined is what is dangerous!
Sure, in cases where there is a strong expectation of a particular
behavior you can have a debate, but no program deliberately reads
uninitialized variables while legitimately expecting some particular
result.

Yes, it looks like icc does this. But so does gcc, see below. There is no "add" in the generated code.

John Regehr


[reg...@babel ~]$ cat undef.c
int foo (int x)
{
   int y;
   return x+y;
}
[reg...@babel ~]$ current-gcc -O3 -S -o - undef.c -fomit-frame-pointer
         .file   "undef.c"
         .text
         .p2align 4,,15
.globl foo
         .type   foo, @function
foo:
         movl    4(%esp), %eax
         ret
         .size   foo, .-foo
         .ident  "GCC: (GNU) 4.5.0 20091117 (experimental)"
         .section        .note.GNU-stack,"",@progbits
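
To see the same effect end to end under valgrind, one can give foo a
caller; this harness is mine, not from the thread, and the file name
is made up:

#include <stdio.h>

int foo (int x)
{
   int y;                      /* never initialized */
   return x+y;
}

int main (void)
{
   printf ("%d\n", foo (1));   /* passes an undefined value to printf */
   return 0;
}

gcc -O0 -g undef-main.c && valgrind -q ./a.out
gcc -O3 -g undef-main.c && valgrind -q ./a.out

Built at -O0, the add is performed on an undefined stack slot, and
memcheck should report a conditional jump on an uninitialised value
once printf converts the result to digits; built at -O3, as the
listing above shows, the read of y never happens, so memcheck has
nothing to report (exact results will vary with the gcc version).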
