On 9/20/12 3:01 PM, foobar wrote:
> On Thursday, 20 September 2012 at 12:35:15 UTC, Andrei Alexandrescu wrote:
>> Let's use the minimum. It is understood it's not what you'll see in
>> production, but it is an excellent proxy for indicative and
>> reproducible performance numbers.
>>
>> Andrei
>
> From the responses on the thread, clearly there isn't a "best way".
I don't quite agree. This is a domain in which intuition has a hard
time, and at least some of the responses come from an intuitive
standpoint, as opposed to hard data.
For example, there's this opinion that taking the min, max, and average
is the "fair" thing to do and the most informative. However, all noise
in measuring timing is additive. Unless you're talking about the
performance of entire large systems with networking, I/O, and the like,
algorithms running in memory inevitably spend their time doing work, to
which various sources of noise (system interrupts, clock quantization,
the benchmarking framework) just _add_ some time. Clearly these
components do affect the visible duration of the algorithm, but if you
want to improve it you need to remove the noise.
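
To make that concrete, here is a minimal sketch of min-of-N timing in
D. It uses today's std.datetime.stopwatch module, and bestOfRuns is a
name made up for illustration, not part of std.benchmark:

import core.time : Duration;
import std.algorithm.comparison : min;
import std.datetime.stopwatch : AutoStart, StopWatch;
import std.stdio : writeln;

// Run fun many times and keep the smallest observed time. Since noise
// only ever adds to the true cost, the minimum is the sample least
// polluted by interrupts, clock quantization, and framework overhead.
Duration bestOfRuns(void delegate() fun, size_t runs = 1000)
{
    auto best = Duration.max;
    foreach (i; 0 .. runs)
    {
        auto sw = StopWatch(AutoStart.yes);
        fun();
        sw.stop();
        best = min(best, sw.peek());
    }
    return best;
}

void main()
{
    import std.algorithm.iteration : sum;
    import std.range : iota;

    // Time a pure in-memory computation.
    auto best = bestOfRuns({ cast(void) iota(1_000_000).sum; });
    writeln("best of 1000 runs: ", best);
}
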
> There are different use-cases with different tradeoffs, so why not
> allow the user to choose the policy best suited for their use-case?
> I'd suggest providing a few reasonable common choices to choose from,
> as well as a way to supply a user-defined calculation (function
> pointer/delegate?)
Reasonable choices are great, but in this case it's a bit difficult to
figure out what's reasonable.
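
For illustration, a pluggable policy along the lines you suggest might
look like this in D; every name here (Policy, takeMin, report, ...) is
invented for the sketch and not std.benchmark's actual interface:

import core.time : Duration, hnsecs, msecs;
import std.algorithm.searching : maxElement, minElement;
import std.stdio : writeln;

// The reduction policy is just a function the caller picks or writes.
alias Policy = Duration function(const(Duration)[]);

// A few canned choices a user could pick from:
Duration takeMin(const(Duration)[] s) { return s.minElement; }
Duration takeMax(const(Duration)[] s) { return s.maxElement; }

Duration takeMean(const(Duration)[] s)
{
    import std.algorithm.iteration : map, sum;
    // Average in hnsecs, then convert back to a Duration.
    return (s.map!(d => d.total!"hnsecs").sum / cast(long) s.length).hnsecs;
}

Duration report(const(Duration)[] samples, Policy policy = &takeMin)
{
    return policy(samples);
}

void main()
{
    auto samples = [12.msecs, 15.msecs, 11.msecs, 40.msecs];
    writeln("min:  ", report(samples));            // canned default
    writeln("mean: ", report(samples, &takeMean)); // user's choice
}

The function-pointer parameter is the user-defined escape hatch; the
canned reductions stand in for the "reasonable common choices".
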
Andrei