Michael Jones wrote:
I need a timer in my program with precision finer than one second.  I
am aware of TimGetTicks(), but I'm unsure how predictable it is.  The way
the API reads makes me think that SysTicksPerSecond() can return different
results at different times - which, over the period of time being
tracked, might fluctuate and distort the results if my final calculation
depends on ticks being produced at a consistent rate.

Consider the following pseudo-code:

startTime = TimGetTicks();

...then once the timer is over, doing the following:

endTime = TimGetTicks();

elapsedTime = endTime - startTime;

numSeconds = elapsedTime / SysTicksPerSecond();
hundredths = (elapsedTime % SysTicksPerSecond()) * 100 / SysTicksPerSecond();

Is this reliable? Anyone have something they have used?

Thanks!

Mike


Yes, this is what we are doing too and it's reliable as long as the device stays awake.

The result given by SysTicksPerSecond() will be constant on any given platform. It may vary between platforms, but that's actually the whole point: some platforms count milliseconds, others count centiseconds, and the Mac simulator counts Mac OS ticks (1/60 s). Note that SysTicksPerSecond() is an OS function, while sysTicksPerSecond is a compile-time macro.

Also, it seems to me that the emulator counts _emulated_ ticks, not wall-clock ticks, so if you break into the debugger the tick counter simply freezes. I don't know whether the same holds when debugging on a device. IIRC, I also got very similar benchmark results on a real M505 and on an emulated M505.

--Martin

--
For information on using the Palm Developer Forums, or to unsubscribe, please see 
http://www.palmos.com/dev/support/forums/
