Hello
All standard control theory books assume a constant sample rate with no
jitter, because that keeps the maths tractable. That consistent
treatment may lead readers to assume that jitter is bad. Jitter is
noise, and noise is never good, but is it significant? Probably not,
so long as the jitter is less than one sample period.
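To put a rough number on that claim, here is a toy simulation of my own (the plant, gains, and jitter level are all assumptions, not anything from a real driver): a PI loop on a first-order plant still settles to its setpoint when each sample interval is jittered by up to 20% of the nominal period.

```python
import random

# Toy sketch: PI control of a first-order plant (dx/dt = -x + u),
# integrated with Euler steps whose length is jittered. All values
# here are invented for illustration.
def run(jitter_frac, steps=5000, seed=1):
    rng = random.Random(seed)
    T = 0.001            # nominal sample period, seconds
    kp, ki = 2.0, 50.0   # PI gains hand-picked for this toy plant
    x, integ = 0.0, 0.0  # plant state and error integrator
    setpoint = 1.0
    for _ in range(steps):
        # actual interval varies by +/- jitter_frac of the nominal period
        dt = T * (1.0 + jitter_frac * (2 * rng.random() - 1))
        err = setpoint - x
        integ += err * dt
        u = kp * err + ki * integ
        x += dt * (-x + u)   # Euler step of the plant
    return abs(setpoint - x)

print(run(0.0), run(0.2))  # zero jitter vs 20% jitter
```

Both runs end close to the setpoint; in this toy case the jittered loop behaves essentially the same as the clean one.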
All the theory says that delay destabilizes control systems (with some
exceptions), so the first aim is to achieve a high sample rate.
If I were writing a software driver, I would aim to maximise the
sample rate first, and to output the results of calculations as soon
as they are available, not necessarily all at the same time. I
wouldn't worry about jitter.
Jitter might matter more in feed-forward systems, where errors (noise)
tend to get amplified, with the potential to produce an unstable
system. But so long as the feed-forward calculation runs at the same
sample rate as the PID loop, there shouldn't be much of a problem.
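The amplification mentioned above comes from the fact that velocity feed-forward effectively differentiates the commanded position, so timing noise in dt enters the FF term directly. A toy combined FF + PID step (the gains and structure are my own assumptions) shows where that happens:

```python
# Toy sketch, not from the original post: one update of a PI loop with
# velocity feed-forward. The (cmd - prev_cmd) / dt term is a numerical
# derivative, so any noise in dt is amplified there; sampling the FF
# term on the same fixed-rate ticks as the PID keeps that bounded.
def ff_plus_pid(prev_cmd, cmd, pos, integ, dt, kp=1.0, ki=0.1, kff=1.0):
    err = cmd - pos
    integ += err * dt
    vel_ff = kff * (cmd - prev_cmd) / dt  # differentiation amplifies dt noise
    out = kp * err + ki * integ + vel_ff
    return out, integ
```

With a 1 ms period, a 1-unit command step gives a feed-forward term of 1000, dwarfing the feedback terms; that is why timing noise in the FF path is the part worth watching.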
In short, just because the theory assumes zero jitter doesn't
necessarily mean that zero jitter is a practical requirement. I
suggest you focus on keeping the sample rate high and don't worry
about jitter.
Regards
Darren Conway
Jon Elson wrote:
Bas Laarhoven wrote:
Ah, thanks Jon, you made it very clear! So in my situation the input
sampling moment is fixed (at the start of the thread) and only the
update of the output varies with the variations in processing time. You
state this will probably have no noticeable effect, but as this
variation is a significant part of the PID cycle time, wouldn't it be
better to reduce the variation (noise) and work with a fixed, although
increased, delay? Another option is to lower the loop update frequency,
as that reduces the noise in more than one way. Do you know of a rule
of thumb for the cycle time in relation to the system response time?
The "pros" usually have a setup so that a hardware clock or
interrupt timer both samples the encoder position and updates
the DAC velocity values at the same instant. The velocity
output is thus delayed until the NEXT servo cycle, but the
timing has very much reduced jitter. You might be able to
obtain a similar result by holding your velocity update back in
the driver, and writing it to the device registers just after
reading the encoder data. Then, the computation would follow
these actions, and the result would be held until the next servo
cycle.
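The ordering Jon describes might look like this in a driver's servo-thread body (a hypothetical sketch in Python for clarity; none of these names are real EMC/HAL calls):

```python
# Hypothetical servo-cycle ordering, illustrating the scheme above:
# write LAST cycle's velocity command immediately before sampling the
# encoder, then do the variable-duration PID computation afterwards.
# This trades one servo period of delay for much lower output jitter.
class ServoCycle:
    def __init__(self):
        self.pending_velocity = 0.0  # result of the previous cycle's PID calc

    def run_once(self, read_encoder, write_dac, pid_update):
        # 1. Output the value computed last cycle, right next to the read,
        #    so both happen at nearly the same, consistent instant.
        write_dac(self.pending_velocity)
        # 2. Sample the feedback.
        position = read_encoder()
        # 3. Compute now; the result is held until the next cycle.
        self.pending_velocity = pid_update(position)
        return position
```

The key property is that the DAC write in cycle N always carries the value computed in cycle N-1, so the write-to-read spacing no longer depends on how long the computation takes.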
Jon
-------------------------------------------------------------------------
This SF.net email is sponsored by: Microsoft
Defy all challenges. Microsoft(R) Visual Studio 2008.
http://clk.atdmt.com/MRT/go/vse0120000070mrt/direct/01/
_______________________________________________
Emc-developers mailing list
Emc-developers@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/emc-developers