On Friday, 21 September 2012 at 04:44:58 UTC, Andrei Alexandrescu wrote:
My claim is unremarkable. All I'm saying is that the minimum running time of an algorithm on a given input is a stable and indicative proxy for the behavior of the algorithm in general. So it's a good target for optimization.

I reached the same conclusion and use the same method at work.

Considering that the min converges toward a stable value quite quickly, would it not be a reasonable default to auto-detect when the min has stabilized, with some degree of statistical certainty, and stop sampling at that point?
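To make the suggestion concrete, here is a minimal sketch (in Python, purely for illustration) of one possible stopping rule: keep timing the function and stop once the observed minimum has not improved for some number of consecutive runs. The function name, the `stable_runs` threshold, and the heuristic itself are hypothetical choices for this example, not the actual std.benchmark algorithm; a real implementation might prefer a proper statistical test.

```python
import time

def benchmark_min(fn, stable_runs=50, max_runs=10_000):
    """Time fn repeatedly and return (min_time, runs_used).

    Stops early once the minimum has not improved for `stable_runs`
    consecutive runs -- a simple, hypothetical stability heuristic.
    """
    best = float("inf")
    since_improvement = 0
    runs = 0
    while runs < max_runs and since_improvement < stable_runs:
        t0 = time.perf_counter()
        fn()
        elapsed = time.perf_counter() - t0
        runs += 1
        if elapsed < best:
            best = elapsed          # new minimum: reset the counter
            since_improvement = 0
        else:
            since_improvement += 1  # no improvement this run
    return best, runs
```

With a fast, deterministic workload the loop typically terminates long before `max_runs`, because the minimum settles after the first few uncontended runs; noisy or contended workloads take longer to stabilize, which is exactly the behavior the auto-detection is meant to capture.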
