> At this point the time measurement is quite crude, with 100-ns resolution. 
> But because we keep the counter running, the unknown residuals will keep
> accumulating, and we should be able to average out this "quantization noise"
> in the long run.  That is, we can measure any T-second period to within 100
> ns, so the resolution on a per-second basis becomes 100 ns / T.

No. The timing resolution is always 100 ns, no matter how long you
measure. You're probably thinking of average frequency, where
dividing by T is sometimes valid: a fixed 100 ns phase uncertainty
over a T-second interval corresponds to a frequency uncertainty of
100 ns / T, so the frequency estimate usually looks better and better
as time goes by.
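
As a back-of-the-envelope check (a toy Python sketch; the interval
values are just examples):

    # Toy illustration: a fixed 100 ns phase resolution gives a
    # fractional frequency resolution of 100 ns / T over an interval T.
    phase_res = 100e-9                  # counter resolution, seconds
    for T in (1, 100, 10_000):          # averaging intervals, seconds
        print(f"T = {T:>6} s -> frequency resolution ~ {phase_res / T:.0e}")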

What saves you here is that your counter noise (100 ns) is likely
greater than the quantization noise, so you can pretty much ignore
the receiver 1PPS quantization noise. For people with much lower
measurement noise (e.g., 1 ns), the quantization noise becomes a much
larger piece of the error pie.
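
As a rough sketch (assuming the two noise terms are independent and
the quantization error is uniform over one step):

    import math

    # Independent noise terms add in quadrature.  The RMS of a uniform
    # quantization error over one 100 ns step is 100 ns / sqrt(12).
    counter_noise = 100e-9                # assumed counter noise, RMS
    quant_noise = 100e-9 / math.sqrt(12)  # receiver 1PPS quantization, ~29 ns RMS
    total = math.hypot(counter_noise, quant_noise)
    print(f"quantization {quant_noise*1e9:.0f} ns, combined {total*1e9:.0f} ns RMS")

With 100 ns counter noise the quantization term adds only a few
percent; with 1 ns counter noise it would dominate.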

Try not to say average "out"; that sounds like the error goes away
or gets smaller over time. You're doing a timing measurement, so the
100 ns measurement granularity is always there, on every measurement.
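
A quick toy simulation (same uniform-quantization assumption) makes
the point: the per-measurement RMS error sits near 100/sqrt(12), or
about 29 ns, no matter how many measurements you take.

    import random

    # Per-measurement RMS error of a 100 ns quantizer: it does not
    # shrink as the number of measurements grows.
    step = 100e-9
    for n in (10, 1_000, 100_000):
        errs = [random.uniform(-step / 2, step / 2) for _ in range(n)]
        rms = (sum(e * e for e in errs) / n) ** 0.5
        print(f"n = {n:>7}: per-measurement RMS ~ {rms*1e9:.0f} ns")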

> Is there any reason why this sort of processing cannot attain equivalent
> performance to the more conventional analog phase-detection approach?

All other factors being equal, a GPSDO based on 100 ns measurement
resolution can never match the performance of a GPSDO based on 10 ns
or 1 ns measurement resolution. Waiting shorter or longer doesn't
change the RMS timing accuracy.

/tvb
