Fellow time-nuts,

As David insisted that I get and read the ITU Handbook Selection and Use of Precise Frequency and Time Systems (1997), and in particular Chapter 3, I took the time to do so. In there I found clause 3.3.2.4.4, Truncation effects, which addresses this issue and which also aligns with my own writing on Allan deviation and the measurement bandwidth limit (I will have to update that one).

The key point is that the main lobe of the kernel function (the shape of sin^4(pi*tau*f)/(pi*tau*f)^2) will be affected by the system bandwidth, so the values will not match the brick-wall analysis of the traditional derivation. The result is that the ADEV measure will be lower than it should be. This situation was analysed by Bernier in 1987 as part of analysing the modified Allan deviation, which has a "software bandwidth filter" in the form of the n*tau_0 averaging filter.
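To see the size of the effect, here is a small sketch of my own (not from the handbook): it integrates the ADEV kernel against a white-FM spectrum S_y(f) = h0 up to a finite cutoff f_h and compares with the wideband brick-wall result h0/(2*tau). The names h0 and f_h, and the 10 Hz bandwidth, are just illustrative assumptions.

import numpy as np

def avar_truncated(tau, f_h, h0=1.0, n=200001):
    """Allan variance of white FM noise, S_y(f) = h0, integrated only
    up to the system bandwidth f_h instead of out to infinity."""
    f = np.linspace(f_h / n, f_h, n)
    x = np.pi * tau * f
    integrand = 2.0 * h0 * np.sin(x)**4 / x**2
    df = f[1] - f[0]
    # plain trapezoidal rule over [0, f_h]; the integrand -> 0 at f = 0
    return (integrand.sum() - 0.5 * (integrand[0] + integrand[-1])) * df

f_h = 10.0                          # assumed 10 Hz system bandwidth
for tau in (0.01, 0.1, 1.0, 10.0):
    ideal = 1.0 / (2.0 * tau)       # h0/(2*tau), valid when 2*pi*f_h*tau >> 1
    print(tau, avar_truncated(tau, f_h) / ideal)

For 2*pi*f_h*tau >> 1 the ratio goes to one, while for small tau it drops well below one, which is exactly the low-biased ADEV the handbook describes.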

So the first few low-n values are even expected to give systematically low results, which is the reason for the ITU-T to put minimum requirements on the ratio of tau_0 to the lowest tau, to ensure that repeatability is achieved.
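As a rough illustration (my numbers, not the ITU-T's), reusing avar_truncated() from the sketch above with tau_0 = 0.1 s and a system bandwidth near Nyquist, f_h = 1/(2*tau_0) = 5 Hz:

tau_0, f_h = 0.1, 5.0
for n in (1, 2, 4, 10, 100):
    tau = n * tau_0
    # ratio of truncated to brick-wall Allan variance at tau = n*tau_0
    print(n, avar_truncated(tau, f_h) / (1.0 / (2.0 * tau)))

In this example the ADEV comes out roughly 20 % low at n = 1 and the bias shrinks quickly with n, which is the behaviour a minimum tau to tau_0 ratio is there to guard against.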

This is also the same effect that Sam Stein mentioned in his presentation during this year's NIST seminars. Sam also went on to discuss the effect of aliasing, which introduces even more false values in that region.
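A quick way to see the aliasing part is to thin a fast-sampled white phase noise record down to tau_0 two ways: plain decimation, which folds everything above the new Nyquist frequency back down, and block averaging, which acts as the same kind of software bandwidth filter the modified Allan deviation uses. A minimal sketch, with an inline overlapping ADEV and made-up numbers for fs and N:

import numpy as np

def oadev(x, tau0, m):
    """Overlapping Allan deviation from phase data x at tau = m*tau0."""
    d = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    return np.sqrt(0.5 * np.mean(d**2)) / (m * tau0)

rng = np.random.default_rng(1)
fs, N = 1000.0, 10                  # fast sample rate and thinning factor
x = rng.normal(scale=1e-9, size=1_000_000)    # white phase noise [s]

x_dec = x[::N]                                # aliases all f > fs/(2N)
x_avg = x[: len(x) // N * N].reshape(-1, N).mean(axis=1)  # filtered first

tau0 = N / fs
for m in (1, 2, 4):
    print(m * tau0, oadev(x_dec, tau0, m), oadev(x_avg, tau0, m))

With white phase noise the decimated track reads about a factor sqrt(N) higher at the lowest taus than the averaged one; the extra power is the aliased noise from above the decimated Nyquist frequency.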

Conclusion: Just don't look all that hard at the lowest tau values, as they can be systematically off. Make sure that you have a tau_0 well below the taus you are interested in, so that your values are reasonably valid.

Cheers,
Magnus
