Hello all,

I'm working on an optimization algorithm (based on some software written in C),
and I need precise timing of how long it takes to find each solution.
Individual runs will likely be very fast (perhaps tens of microseconds) for
some of my test problems.

I'm not familiar with the accuracy of the @time macro. Does it accurately time
execution to within a microsecond? It reports execution times of a few
microseconds for my functions, but I don't know how accurate those figures
actually are.
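
For concreteness, here's roughly how I've been timing a single run (just a
sketch; solve and problem are stand-ins for my actual code):

    solve(problem)               # warm-up call so first-run JIT compilation isn't timed
    @time solve(problem)         # prints elapsed time and allocation info
    t = @elapsed solve(problem)  # elapsed wall time in seconds, as a Float64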

The C code this is based on uses clock_gettime to get timing information
accurate to a few nanoseconds. I don't quite need that accuracy, but the
roughly 10 millisecond resolution of the standard Unix "time" facilities
isn't acceptable.
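
If @time isn't reliable at that scale, my understanding is that time_ns()
reads a monotonic nanosecond-resolution clock (clock_gettime with
CLOCK_MONOTONIC on Linux, I believe), so presumably I could do something
like this instead (again just a sketch):

    t0 = time_ns()                  # monotonic clock, UInt64 nanoseconds
    result = solve(problem)
    t1 = time_ns()
    elapsed_us = (t1 - t0) / 1_000  # nanoseconds -> microseconds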

Also, averaging over many runs isn't an option: because this is a heuristic
algorithm, I'm interested in the distribution of times itself. I want to know
how long I would need to run it to have found a solution with, say, 99%
confidence, so I need the actual distribution of times to solution.
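
To illustrate, the kind of measurement loop I have in mind is something like
this (a sketch; problems stands for a hypothetical collection of test
instances):

    using Statistics

    solve(first(problems))  # warm-up so JIT compilation doesn't pollute the samples
    times = [@elapsed(solve(p)) for p in problems]
    p99 = quantile(times, 0.99)  # empirical 99th-percentile time-to-solution, in seconds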

Thanks!
