On 02/12/2008, at 5:08 PM, Daniel Luis dos Santos wrote:
Hello,
Not sure I should ask this here.
How do I get time measurements in milliseconds? What is the accuracy
of the Mac's C library implementation?
I am using the clock() function from time.h and measuring
differences in seconds between firings of an NSTimer.
The NSTimer fires every 0.1 secs, and in the event code I measure
the difference between the current clock() call and the previous one. I
then divide it by the CLOCKS_PER_SEC constant and get a result two
orders of magnitude smaller than the timer's interval: 0.001
secs instead of the 0.1 of the NSTimer.
Any ideas?
You can use the Microseconds() call to get microsecond-accurate
timing. It's declared in CoreServices/Timer.h.
This page has some handy info on making use of it:
http://www.meandmark.com/timingpart2.html
--
Rob Keniger
_______________________________________________
Cocoa-dev mailing list (Cocoa-dev@lists.apple.com)
Please do not post admin requests or moderator comments to the list.
Contact the moderators at cocoa-dev-admins(at)lists.apple.com
Help/Unsubscribe/Update your Subscription:
http://lists.apple.com/mailman/options/cocoa-dev/archive%40mail-archive.com
This email sent to [EMAIL PROTECTED]