On 08.04.2012 20:59, Denis Shelomovskij wrote:
> Very good, but the minimum isn't the best guess. Personally I (and there
> will be a lot of such maniacs, I suppose) think that this (minimum) time
> can be significantly smaller than the average time.


A prime example is networking.

> So a parameter (probably with a default value) should be added.
> Something like an enum of flags telling what we want to know. At least
> these look usable: minTime, <some mean time>, maxTime,
> standardDeviation,

+1

> graph (yes, good old ASCII art).

Yes, a graph is needed.


No, not ASCII art though. It's too coarse-grained, and the only good thing it can do is look nice in a terminal (or give an overall picture, but it's limited to, say, ~25 blocks in a histogram).
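Coming back to the flags idea above, just to make it concrete, here is a rough sketch of how such an interface could look. None of this is std.benchmark; Stat and benchStats are made-up names, and the timing simply goes through std.datetime's StopWatch (the 2.x API as it is today):

import std.datetime : StopWatch;
import std.math     : sqrt;
import std.stdio    : writefln;

// Made-up flags enum: which statistics to report.
enum Stat
{
    minTime           = 1,
    meanTime          = 2,
    maxTime           = 4,
    standardDeviation = 8,
    all               = minTime | meanTime | maxTime | standardDeviation
}

// Run fun() `runs` times and print the requested statistics (in microseconds).
void benchStats(alias fun)(size_t runs, uint what = Stat.all)
{
    auto samples = new double[runs];
    StopWatch sw;
    foreach (i; 0 .. runs)
    {
        sw.reset();
        sw.start();
        fun();
        sw.stop();
        samples[i] = sw.peek().usecs;
    }

    double min = double.max, max = 0, sum = 0;
    foreach (s; samples)
    {
        if (s < min) min = s;
        if (s > max) max = s;
        sum += s;
    }
    immutable mean = sum / runs;
    double var = 0;
    foreach (s; samples)
        var += (s - mean) * (s - mean);
    immutable stdDev = sqrt(var / runs);

    if (what & Stat.minTime)           writefln("min:    %s us", min);
    if (what & Stat.meanTime)          writefln("mean:   %s us", mean);
    if (what & Stat.maxTime)           writefln("max:    %s us", max);
    if (what & Stat.standardDeviation) writefln("stddev: %s us", stdDev);
}

Calling it would then be e.g. benchStats!({ myFunction(); })(1_000, Stat.minTime | Stat.standardDeviation), with myFunction being whatever you want to measure. An ASCII histogram could always be layered on top of the same samples array if someone really wants one.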

What would be nice is an option to output bare-bones info so that one can easily adapt it and feed the result to e.g. gnuplot.
I think CSV, or its tab-separated variant, is good enough in this case.
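For the gnuplot route the raw dump does not need to be anything more than this (again just a sketch; dumpSamples is a made-up name):

import std.datetime : StopWatch;
import std.stdio    : File;

// Write one tab-separated (run index, time in usecs) pair per line.
// The '#' header line is treated as a comment by gnuplot.
void dumpSamples(alias fun)(size_t runs, string path)
{
    auto f = File(path, "w");
    f.writeln("# run\tusecs");
    StopWatch sw;
    foreach (i; 0 .. runs)
    {
        sw.reset();
        sw.start();
        fun();
        sw.stop();
        f.writefln("%s\t%s", i, sw.peek().usecs);
    }
}

Then something like plot "samples.tsv" using 1:2 with lines in gnuplot gives the full picture instead of ~25 ASCII blocks.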

Another cool addition, IMHO, would be parametric benchmarks: a function plus a set of parameters (a single parameter is fine too) to benchmark over. That makes even more sense combined with graphs, as an algorithm's profile plotted for various inputs (sizes) can be very revealing.
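Something along these lines, say (benchParam is again a made-up name, not a proposal for the actual API):

import std.datetime : StopWatch;
import std.stdio    : writefln;

// Time fun(n) once per input size and print (size, time) pairs,
// i.e. the data points of the algorithm's profile.
void benchParam(alias fun)(size_t[] sizes)
{
    StopWatch sw;
    foreach (n; sizes)
    {
        sw.reset();
        sw.start();
        fun(n);
        sw.stop();
        writefln("%s\t%s", n, sw.peek().usecs);
    }
}

Used as, say, benchParam!((size_t n) { auto a = randomArray(n); sort(a); })([1_000, 10_000, 100_000]), where randomArray is whatever builds the input, and the output again goes straight to gnuplot.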

--
Dmitry Olshansky
