On Mon, Jul 19, 2010 at 12:18 PM, Nadav Har'El <n...@math.technion.ac.il> wrote:

> Imagine, for example, that you run a certain program 5 times and get the
> times: 20.0, 18.0, 18.1, 27.0, 18.1
> Evidently, the first run was slower because things were not in the cache,
> and the run that took 27.0 was delayed by some other process in the
> background taking up the CPU or disk. The minimum run time, 18.0, is the
> most interesting one - it is the time a run would take every time, if
> things were perfect. If you average the above numbers, or find the
> standard deviation, etc., the numbers would not be very interesting...
>
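As a quick sanity check of the numbers in the quote, here is a minimal sketch
(just the five sample values above, nothing else assumed) showing how the
minimum compares to the mean and standard deviation once the 27.0 outlier is
included:

    public class TimingStats {
        public static void main(String[] args) {
            // The five run times quoted above, in seconds.
            double[] t = {20.0, 18.0, 18.1, 27.0, 18.1};

            double min = Double.MAX_VALUE, sum = 0.0;
            for (double x : t) {
                min = Math.min(min, x);
                sum += x;
            }
            double mean = sum / t.length;

            double sq = 0.0;
            for (double x : t) {
                sq += (x - mean) * (x - mean);
            }
            double stddev = Math.sqrt(sq / (t.length - 1)); // sample std deviation

            // Prints roughly: min=18.0 mean=20.24 stddev=3.87
            System.out.printf("min=%.1f mean=%.2f stddev=%.2f%n", min, mean, stddev);
        }
    }

The single slow run barely moves the minimum but drags the mean up by more
than two seconds, which is exactly the point being made in the quote.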

I just heard my intuition against that claim echoed by a master, Joshua Bloch
of "Effective Java" fame.

Long story short, he claims that modern computers are now highly
non-deterministic; he demonstrated a 20% running-time variation for the same
JVM running the same code. He claims, as I felt, that you must employ
statistics on benchmark results to get a meaningful answer, and I think this
implies that the minimum is not the way to go here. I recommend this
30-minute talk regardless of this discussion.
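To make the "employ statistics" point concrete, here is a minimal sketch of a
repeated-measurement harness: run the same workload many times inside one
JVM, then report the whole distribution (min, mean, stddev) rather than a
single number. The workload (summing a random array), the warm-up count and
the number of runs are just placeholders, not anything from the talk:

    import java.util.Arrays;
    import java.util.Random;

    public class RepeatedTiming {
        // Trivial stand-in for the code you actually want to benchmark.
        static long workload(int[] data) {
            long sum = 0;
            for (int x : data) sum += x;
            return sum;
        }

        public static void main(String[] args) {
            int[] data = new Random(42).ints(1_000_000).toArray();
            int warmup = 10, runs = 30;
            double[] millis = new double[runs];

            // Let the JIT compiler settle before measuring.
            for (int i = 0; i < warmup; i++) workload(data);

            for (int i = 0; i < runs; i++) {
                long start = System.nanoTime();
                workload(data);
                millis[i] = (System.nanoTime() - start) / 1e6;
            }

            double min = Arrays.stream(millis).min().getAsDouble();
            double mean = Arrays.stream(millis).average().getAsDouble();
            double variance = Arrays.stream(millis)
                                    .map(x -> (x - mean) * (x - mean))
                                    .sum() / (runs - 1);
            System.out.printf("min=%.3f ms  mean=%.3f ms  stddev=%.3f ms%n",
                              min, mean, Math.sqrt(variance));
        }
    }

Even with this toy workload you will typically see run-to-run spread within a
single JVM, which is the non-determinism Bloch demonstrates in the talk.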

Video: http://parleys.com/#id=2103&sl=12&st=5
slides: http://wiki.jvmlangsummit.com/images/1/1d/PerformanceAnxiety2010.pdf