On Wednesday, 22 April 2015 at 20:29:49 UTC, Walter Bright wrote:
On 4/22/2015 12:51 PM, ponce wrote:
I didn't appreciate how important default initialization was before having to fix a non-deterministic, release-only, time-dependent bug in a video encoder some months ago. It came down to just two uninitialized member variables (C++ doesn't require member initialization in constructors). If one of them happened, by virtue of randomness, to be _exactly equal to 1_, the encoder would perform anywhere from 0 to 2 billion motion estimation steps, which is very slow but not a total halt. A watchdog mechanism would detect this and reboot, hence the bug was labelled "a deadlock". It would disappear in debug mode, since variables are initialized there.
The default initialization comes from bitter personal experience, much like yours!
That gives a whole other meaning to "zero cost abstractions": in three weeks of investigation I could have sped up the program by ~5%, much more than the supposed slowdown of variable initialization.
Most of the implicit initializations become "dead stores" and are removed by the optimizer anyway.
Is it even possible to contrive a case where
1) the default initialisation stores are technically dead, and
2) modern compilers can't tell they are dead and elide them, and
3) doing the initialisation has a significant performance impact?
The boring example is "extra code causes instruction cache misses".