"Dan Presley" <dpres...@...> wrote:
>         c = a/++a;
> 
> I got a compiler warning about the operation on a (the '++a')
> being undefined when I compiled the source.  When I ran the exe,

Mom told me about putting a fork in the toaster, but when I
tried it...

> ... My guess is the compiler, when executing the increment,

The compiler is just one part of the implementation. Optimisers
often work at much lower, more capricious levels. With this
kind of code, what you see today may not be what you see
tomorrow.

> BTW, I use g++, the GCC C++ compiler.

Which is one of the few compilers that will actually warn you
of such (stupid) mistakes.
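(That's gcc's -Wsequence-point at work, if I remember right;
-Wall switches it on.)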

> I hope this helps solve your dilemma.

More important than whether the compiler can figure out the
intended semantics is whether a human can. I can't, so I
don't code like that. That is the solution to the 'dilemma'.
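
If you do want "the value before the increment divided by the
value after it", just say so in separate statements. A rough
sketch, assuming that really was the intent:

  unsigned old = a;  // copy first (assuming a is unsigned; use its real type)
  ++a;               // then increment, as its own statement
  c = old / a;       // the order of evaluation is now beyond doubt

No sequence points to argue about, and nothing for the optimiser
to get creative with.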

It can be interesting to study disasters in infinite detail.
But seriously, this kind of code is one disaster that just
isn't worth analysing.

Instead, try to study what gcc does with, say...

  unsigned f(unsigned a) { return a % 10; }
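
To see for yourself, compile with something like

  g++ -O2 -S f.cpp

(whatever the file is actually called) and read the resulting
f.s. On most targets you'll find the '% 10' has turned into a
multiply-and-shift reciprocal trick with no divide instruction
in sight, which is a far more rewarding thing to pick apart.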

-- 
Peter
