Vincent Lefevre wrote:
On 2009-01-27 07:08:51 -0500, Robert Dewar wrote:
[...]
Interestingly, my viewpoint on this from the compiler domain might
seem to be quite inconsistent. I think it is generally a bad idea
for compilers to aggressively optimize based on assumptions that
programs are free of these kinds of mistakes.
[...]

Can you define "aggressively optimize" please? :)

I mean doing optimizations based on assuming that a program is well
defined. Yes, it's hard to formalize this requirement, especially
if you try to formalize the specific rule that information not be
back-propagated (the sort of reasoning that says an if must go
this way, because if it goes the other way it will result in
undefined behavior). It's more like a principle that has to be
discussed in each case.

(This term is completely meaningless, IMHO, or if you prefer, highly
subjective.)

It does not have a formal definition, but it's a useful principle.
For instance, let's apply it in this case:

Dereferencing null is undefined.

Suppose instead we say that it has an implementation-defined effect.
Then general, implementation-independent optimization cannot make
any assumptions that a dereference of null cannot happen. I assume
that's roughly what the -f switch does in this case. An interesting
question is whether this -f switch is formally defined; I doubt it
is, but that doesn't mean it is useless.

