On 10/29/16 10:14 PM, Tom Van Baak wrote:
One might expect that the actual ADEV value in this situation would be
exactly 1 ns at tau = 1 second.  Values of 0.5 ns or sqrt(2)/2 ns might not
be surprising. My actual measured value is about 0.65 ns, which does not
seem to have an obvious explanation.  This brings to mind various questions:

What is the theoretical ADEV value of a perfect time-interval measurement
quantized at 1 ns? What's the effect of an imperfect measurement
(instrument errors)? Can one use this technique in reverse to sort
instruments by their error contributions, or to tune up an instrument
calibration?

Hi Stu,

If you have white phase noise with a standard deviation of 1 ns, then the ADEV 
at tau = 1 second will be sqrt(3) ns. This is because each term in the Allan 
variance is built from the second difference of 3 phase samples, 
x(i+2) - 2*x(i+1) + x(i), and the variance of a sum of independent random 
variables is the sum of the variances: (1 + 4 + 1)*sigma^2 = 6*sigma^2, which 
the factor of 1/2 in the AVAR definition reduces to 3*sigma^2. So if your 
standard deviation is 0.5 ns, then the AVAR should be 3 * (0.5 ns)^2 = 0.75 
ns^2 and the ADEV should be 0.87 ns, which is sqrt(3)/2 ns. You can check this 
with a quick simulation [1].
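
Here is a minimal sketch of that check in Python (assuming numpy is available; 
this is not the simulation referenced in [1], just an independent sanity check):

    import numpy as np

    N = 1_000_000
    sigma = 0.5e-9      # 0.5 ns of white phase noise, in seconds
    tau = 1.0           # 1 s between phase samples

    x = np.random.normal(0.0, sigma, N)       # phase (time error) data

    # Overlapping Allan variance at tau: half the mean squared second
    # difference of the phase data, divided by tau^2.
    d2 = x[2:] - 2 * x[1:-1] + x[:-2]
    adev = np.sqrt(0.5 * np.mean(d2 ** 2)) / tau

    print(adev)         # ~0.87e-9, i.e. sqrt(3) * 0.5 ns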

Note this assumes that the 1 ns quantization error has a normal distribution 
with a standard deviation of 0.5 ns. Someone who's actually measured the 
HP 5334B quantization noise can correct this assumption.


Isn't the distribution of quantization error more like a rectangular 
distribution (e.g. like an ADC), so a variance of (1 ns)^2/12?
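
If the error really is uniform across the 1 ns bin, then sigma = 1 ns/sqrt(12), 
about 0.29 ns, and the same sum-of-variances argument predicts an ADEV of 
sqrt(3) * 1 ns / sqrt(12) = 0.5 ns at tau = 1 s. A quick variation on the 
sketch above (again assuming uniform quantization noise, not a measurement of 
the counter):

    import numpy as np

    N = 1_000_000
    tau = 1.0
    q = 1e-9                                  # 1 ns quantization step

    # Rectangular (uniform) quantization error: variance = q^2 / 12.
    x = np.random.uniform(-q / 2, q / 2, N)

    d2 = x[2:] - 2 * x[1:-1] + x[:-2]
    adev = np.sqrt(0.5 * np.mean(d2 ** 2)) / tau

    print(adev)         # ~0.5e-9: sqrt(3) * (1 ns / sqrt(12)) = 0.5 ns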




