On 28/02/14 14:18, Poul-Henning Kamp wrote:
> In message <[email protected]>, Bob Camp writes:
>> To me the next layer here is to see if the basic accuracy of the device
>> can be improved in software.
>
> I have a hard time seeing how that would happen.
>
> I think one of the best chances would be to improve the phase
> noise of the 200MHz signal.
>
> But don't miss the fact that being able to make a LOT more measurements
> in the same time also improves noise statistically.
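
The gain from plain averaging is easy to put a number on: with white,
independent single-shot noise the standard error drops like 1/sqrt(N).
A minimal sketch of that (the 100 ps single-shot jitter and the
10000-reading average are made-up illustration numbers, not figures for
this counter):

import random, statistics

def averaged_reading(n_readings, single_shot_sigma=100e-12, true_interval=1e-6):
    # One "result" = the mean of n_readings independent time-interval readings.
    readings = [true_interval + random.gauss(0.0, single_shot_sigma)
                for _ in range(n_readings)]
    return statistics.mean(readings)

# Scatter of 200 such averaged results: roughly 100 ps / sqrt(10000) = 1 ps.
results = [averaged_reading(10000) for _ in range(200)]
print(statistics.stdev(results))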
What one could possibly do is to see if there are any round-off issues
that cause noise. The interpolator does not have a number-magic-friendly
gearing for decimal or binary numbers, unless you play some tricks with it.
Rounding off causes the interpolator points to be unevenly distributed,
which adds a little noise to the measurement.
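
As a rough sketch of that effect (the 256-step interpolation factor and
the 1 ps result resolution are assumptions for illustration, not the
actual gearing of this counter): rounding each interpolator step to a
fixed LSB leaves a grid that is no longer uniform, and the rounding error
shows up as extra noise.

import statistics

steps = 256                         # assumed interpolation factor
clock_period_ps = 5000.0            # 200 MHz timebase -> 5 ns period
ideal = [i * clock_period_ps / steps for i in range(steps)]  # 19.53125 ps/step
rounded = [round(x) for x in ideal]                          # stored in whole ps

# The rounded grid mixes 19 ps and 20 ps steps, i.e. it is no longer uniform...
print(sorted(set(b - a for a, b in zip(rounded, rounded[1:]))))

# ...and the round-off itself adds roughly 0.3 ps rms of extra noise.
print(statistics.pstdev([r - x for r, x in zip(rounded, ideal)]))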
However, I would look at the 200 MHz systematics first. This is only meant
to show what you could possibly do to improve precision in software.
With a hotter CPU you can naturally do smarter auto-triggers and
auto-tunes and things like that.
Doing a CNT-90-like frequency estimator would indeed be possible and would
provide better frequency measurements. A frequency drift estimator would
maybe be a nice addition?
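
For what a regression-based estimator buys you, here is a minimal sketch
(not the CNT-90's actual algorithm, and the 10 MHz input with 0.1 Hz/s of
drift is invented test data): the least-squares slope of accumulated cycle
count versus timestamp gives frequency, and the slope of block-wise
frequency estimates versus time gives a crude drift number.

def lsq_slope(xs, ys):
    # Ordinary least-squares slope of ys against xs.
    n = len(xs)
    xm, ym = sum(xs) / n, sum(ys) / n
    num = sum((x - xm) * (y - ym) for x, y in zip(xs, ys))
    den = sum((x - xm) ** 2 for x in xs)
    return num / den

# Invented test data: 10 MHz input with 0.1 Hz/s drift, timestamped every 1 ms.
times  = [i * 0.001 for i in range(2000)]
counts = [10e6 * t + 0.5 * 0.1 * t * t for t in times]

freq = lsq_slope(times, counts)     # mean frequency over the whole interval

# Drift: frequency per 200-timestamp block, then the slope of those vs. time.
block = 200
mids  = [sum(times[i:i+block]) / block for i in range(0, len(times), block)]
freqs = [lsq_slope(times[i:i+block], counts[i:i+block])
         for i in range(0, len(times), block)]
drift = lsq_slope(mids, freqs)

print(freq, drift)                  # roughly 1.0e7 Hz and 0.1 Hz/s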
Cheers,
Magnus