I'm chasing a weird timing problem.  LabVIEW 7.0 + Windows 2000

I have a VI (called "the Object"), whose purpose is to store and
retrieve numerical values by name.

The name is hashed and converted to an array index, and the values are
read from / written to that array slot.

That all works fine.
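For readers unfamiliar with this pattern, here is a minimal sketch of the idea in Python (LabVIEW is graphical, so this is only an illustration). The table size and the linear-probing collision handling are assumptions; the post doesn't describe how the Object resolves hash collisions.

```python
SLOTS = 1024  # assumed table size, not from the post

names = [None] * SLOTS    # which name owns each slot
values = [0.0] * SLOTS    # the numeric value stored in each slot

def slot_for(name):
    # Hash the name and fold it into an array index; linear probing
    # (an assumption) steps past slots owned by other names.
    i = hash(name) % SLOTS
    while names[i] is not None and names[i] != name:
        i = (i + 1) % SLOTS
    return i

def store(name, value):
    i = slot_for(name)
    names[i] = name
    values[i] = value

def retrieve(name):
    return values[slot_for(name)]
```

Writing to a pre-allocated array slot rather than searching a list each time is what makes the per-access cost roughly constant.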


I have a tester for the Object.  It generates random names and random
values, starts a timer, stores the values (say 1000 of them), marks the
timer, reads the values, and marks the timer again.  The point is to
measure the execution time of the Object.

That all works fine.
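The tester's structure can be sketched in the same way (again in Python, as an illustration rather than the actual VI). The `store`/`retrieve` callables and the name length are placeholders; the timer marks bracket the write pass and the read pass separately, as described above.

```python
import random
import string
import time

def make_pairs(n=1000):
    # Random 8-character names (length is an assumption) paired with
    # random values, mimicking the tester's generated data.
    def rand_name():
        return ''.join(random.choices(string.ascii_lowercase, k=8))
    return [(rand_name(), random.random()) for _ in range(n)]

def benchmark(store, retrieve, pairs):
    t0 = time.perf_counter()           # start timer
    for name, value in pairs:
        store(name, value)
    t1 = time.perf_counter()           # mark after the writes
    for name, _ in pairs:
        retrieve(name)
    t2 = time.perf_counter()           # mark again after the reads
    # Report write time and read time in mSec, like the post.
    return (t1 - t0) * 1000, (t2 - t1) * 1000
```

Running the write loop and read loop between separate timer marks lets you attribute the cost to each operation rather than to the pair.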


EXCEPT


I measure a particular case at 11.44 mSec for 1000 writes.
Repetitions result in numbers similar to 11.44

I then load a particular VI from a program I'm having trouble with.
The trouble is it's taking more time than expected to process data.

This VI is normally set to execute at TIME CRITICAL priority.

This VI uses (indirectly) the Object.

If I load this VI WITHOUT RUNNING IT, and run the tester, the
execution time goes from 11.44 to 27.08 mSec.

The VI is NOT RUNNING, yet its presence affects the execution time of
a subVI it uses.

If I change its priority to NORMAL or BACKGROUND, then the tester
reports the shorter times (11.44 or so).

If I change its priority to ANYTHING above NORMAL, I get a 27+ mSec
time.  I can change it back to normal, and get the "normal" execution
time.

Why does a given VI's priority affect its subVIs' execution time IF
IT IS NOT EVEN RUNNING?

And how do I fix it?
