A few dumb questions:

But first, a quote from the ITU (Recommendation G.810, where these definitions appear):

""4.1.12 (timing) jitter: The short-term variations of the significant instants 
of a timing signal from
their ideal positions in time (where short-term implies that these variations 
are of frequency greater
than or equal to 10 Hz).

4.1.15 wander: The long-term variations of the significant instants of a 
digital signal from their
ideal position in time (where long-term implies that these variations are of 
frequency less than
10 Hz).
NOTE – For the purposes of this Recommendation and related Recommendations, 
this definition does
not include wander caused by frequency offsets and drifts.""

DQ1. Both refer to phase variations, so apart from the frequency ranges specified, are they mathematically equivalent?
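As I read the definitions, jitter and wander are the same phase-deviation record, just split into bands above and below 10 Hz. A minimal sketch of that reading (synthetic data, sample rate and tone frequencies are my own illustrative choices, using an ideal FFT brick-wall split rather than the measurement filters the standards actually specify):

```python
import numpy as np

fs = 1000.0                       # sample rate of the phase record, Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)   # 2 s of data
# synthetic phase deviation: a 1 Hz "wander" tone plus a 50 Hz "jitter" tone
phase = 0.5 * np.sin(2 * np.pi * 1.0 * t) + 0.1 * np.sin(2 * np.pi * 50.0 * t)

# split the same record at 10 Hz with an ideal (FFT brick-wall) filter
spec = np.fft.rfft(phase)
freqs = np.fft.rfftfreq(len(phase), d=1.0 / fs)
wander = np.fft.irfft(np.where(freqs < 10.0, spec, 0.0), n=len(phase))
jitter = np.fft.irfft(np.where(freqs >= 10.0, spec, 0.0), n=len(phase))

# the two bands recombine to the original record: same quantity, two bands
recombined = wander + jitter
```

Under this reading the two are mathematically the same quantity, differing only in the band selected; whether the standards intend anything beyond that band split is exactly the question.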

DQ2. The note on wander excludes frequency offsets, but no such exclusion is stated for jitter. Do I have to include a frequency offset in jitter measurements? It seems to me that it makes no sense to do so.

DQ3. Can I deduce an underlying frequency offset from jitter (or wander) by taking an RMS value over some window of values?
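For what it's worth, a constant frequency offset appears in the phase record as a linear ramp, so it shows up in the slope of phase vs. time (e.g. a least-squares fit), whereas an RMS over a window measures the spread of the variations, not the ramp. A sketch with made-up numbers (offset value, noise level, and sample rate are all illustrative assumptions):

```python
import numpy as np

fs = 100.0
t = np.arange(0, 10.0, 1.0 / fs)
f_offset = 2e-3                      # assumed fractional frequency offset
rng = np.random.default_rng(0)
# phase record = linear ramp from the offset + zero-mean jitter noise
phase = f_offset * t + 1e-4 * rng.standard_normal(len(t))

# the offset is recovered as the slope of a least-squares line fit
slope, intercept = np.polyfit(t, phase, 1)

# RMS of the detrended record measures only the jitter spread
residual_rms = np.sqrt(np.mean((phase - (slope * t + intercept)) ** 2))
```

So a slope (or mean frequency) estimate over the window seems like the natural way to pull out the offset, with the RMS then characterizing the jitter/wander left after detrending.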


regards,
Mike

_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.