> However, I am interested in knowing the time taken by some part of my
> labview code down to 100 microsecond resolution, rather than delay. Is
> it still impossible?

Ah.  A common technique mentioned in the other post is to use repetition 
so that a lower-resolution timer can still be used: repeat the code many 
times, time the whole loop, then divide by the repeat count to recover 
the duration of the short event, even down to sub-microsecond times.
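Since LabVIEW diagrams can't be shown in plain text, here is a rough 
Python sketch of the same averaging idea (in LabVIEW you would use the 
Tick Count (ms) function around a For Loop instead):

```python
import time

def time_per_call(fn, n=100_000):
    """Estimate a fast function's duration by averaging many repetitions.

    A coarse timer (like a millisecond tick count) cannot resolve a
    single sub-millisecond call, but the total over n calls is long
    enough to measure; dividing by n recovers the per-call time.
    """
    start = time.monotonic()      # stand-in for a coarse tick-count timer
    for _ in range(n):
        fn()
    total = time.monotonic() - start
    return total / n              # average seconds per call

# Time a trivial operation that is far below the timer's resolution.
per_call = time_per_call(lambda: sum(range(10)))
print(f"~{per_call * 1e6:.2f} microseconds per call")
```

The trade-off is that you get the average over many runs, not the time 
of any single run, so jitter between runs is averaged away.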

Another technique is to use the high resolution windows timers as 
mentioned in this article

http://sine.ni.com/apps/we/niepd_web_display.DISPLAY_EPD4?p_guid=B45EACE3DE8556A4E034080020E74861&p_node=DZ52018&p_submitted=N&p_rank=&p_answer=&p_source=External

If the link doesn't work, search for high-precision timer or something 
similar.
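For comparison, most languages expose that same high-resolution counter 
directly. A small Python sketch (time.perf_counter() wraps 
QueryPerformanceCounter on Windows, which is what the NI article's 
approach calls into):

```python
import time

# Report the resolution of the OS high-resolution counter; it is
# typically well below the 100-microsecond target in the question.
res = time.get_clock_info("perf_counter").resolution
print(f"perf_counter resolution: {res} s")

# Time a single short section of code directly, no averaging needed.
start = time.perf_counter()
total = sum(range(1000))
elapsed = time.perf_counter() - start
print(f"elapsed: {elapsed * 1e6:.2f} microseconds")
```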

Greg McKaskle
