Hi Fred,

On 05/14/2011 01:02 PM, Tijd Dingen wrote:
Magnus Danielson wrote:
Notice that the pre-scaler is only used for higher frequencies.

Understood. I was just using the prescaler as an example for the "what if
I take every Nth edge" idea.

Consider then the typical measurement setup:

A counter is set up to make a time interval measurement from channel A to channel B on each occurrence of an external arm trigger. Consider that a GPS provides a PPS pulse to the external arm input and a 10 MHz signal to channel A. The DUT provides a 10 MHz signal to channel B.

In this setup there will be 10 million cycles on channels A and B for each 1 s arming interval. This is not a problem for ADEV/AVAR. The tau will be 1 s or integer multiples thereof.

However, if you want a quality measure at 1 s, then you had better measure at a higher rate, say 1 kHz, in order to get a larger amount of data without having to wait very long. Algorithmic improvements have been made to achieve higher quality more quickly from the same data. Overlapping measures make fair use of the data for shorter taus.
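To make the overlapping part concrete, here is a rough Python sketch of mine (function name and rates are just for illustration, not from any particular tool) of the overlapping Allan deviation estimator, working on a record of phase/time-error samples x taken every tau0 seconds:

import numpy as np

def overlapping_adev(x, tau0, m):
    # Overlapping Allan deviation at tau = m * tau0 from phase
    # (time-error) samples x taken every tau0 seconds.
    x = np.asarray(x, dtype=float)
    # second differences at stride m, over all overlapping start points
    d2 = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    return np.sqrt(np.mean(d2**2) / (2.0 * (m * tau0)**2))

With data taken at 1 kHz (tau0 = 1 ms), tau = 1 s corresponds to m = 1000, and every sample still contributes to the estimate, which is why you get a useful 1 s number much sooner than by waiting for non-overlapping 1 s intervals.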

Notice that you need to adjust your data for cycle-slips. If you don't, you will take a significant performance hit, typically an ADEV curve several decades higher than expected.
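As a minimal sketch of what I mean by adjusting for cycle-slips (assuming the time-interval readings wrap modulo one carrier period, 100 ns for a 10 MHz signal):

import numpy as np

def remove_cycle_slips(x, period=100e-9):
    # Undo cycle-slips in time-interval readings x (in seconds) that
    # wrap modulo one carrier period. Any step larger than half a
    # period is counted as a whole-period slip and subtracted out.
    x = np.asarray(x, dtype=float)
    slips = np.round(np.diff(x) / period)
    x = x.copy()
    x[1:] -= period * np.cumsum(slips)
    return x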

Plus, I strongly suspect that all these commercial counters that can
handle 6 GHz and such are not timestamping every single cycle
back-to-back either. Especially the models that have a few versions
in the series. One cheaper one that can handle 300 MHz for example,
and a more expensive one that can handle 6 GHz. That reads like:
"All models share the same basic data processing core and the same
time interpolators. For the more expensive model we just slapped on
a high-bandwidth input + a prescaler."

You never time-stamp individual cycles anyway, so a pre-scaler doesn't make
much difference. It does limit the granularity of the tau values you use, but
usually not in a significant way, since Allan variance is rarely used for taus
shorter than 100 ms, and well... the pre-scaled period usually stays below
100 ns, so it isn't a big difference.

Well, I can certainly /try/ to be able to timestamp individual cycles. ;) That
way I can for example characterize oscillator startup and such. Right now I can
only spit out a medium-resolution timestamp every cycle for frequencies up to
about 400 MHz, and a high-resolution timestamp every cycle for frequencies up
to about 20 MHz.

Medium resolution being on the order of 100 ps, and high resolution being on
the order of 10 ps. The medium resolution is possibly even a little worse than
that due to non-linearities, but there are still a few ways to improve it. It
just requires an awful lot of design hand-holding to manually route parts of
the FPGA design. I.e.: "I will do that later. Much, much later." ;->

But understood, for Allan variance you don't need timestamps for every
individual cycle.

No. Certainly not.

I do lack one rate in your discussion: your time-stamp rate, i.e. the maximum sample rate you can handle, as limited by the minimum time between two measurements. For instance, the HP5372A has a maximum sample rate of 10 MS/s in normal mode (100 ns to store a sample), while in fast mode it can do 13.33 MS/s (75 ns to store a sample). The interpolator uses a delay architecture to provide quick turn-around interpolation, which gives only 200 ps resolution (100 ps resolution is supported in the architecture, if only the boards had been designed for it, so there is a hidden upgrade which never came about).

Do you mean to say that your low resolution time-stamping rate is 400 MS/s and high resolution time-stamping rate is 20 MS/s?

It is perfectly respectable to skip a number of cycles, but the number of cycles must be known. One way is to have an event counter which is sampled; another is to always provide samples at a fixed distance event-counter-wise, such that the event counter can be rebuilt afterwards. The latter method saves data, but has the drawback that your observation period becomes dependent on the frequency of the signal, which may or may not be what you want, depending on your application. A sketch of the latter follows below.
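A quick sketch of the rebuild (names are mine): if a timestamp is stored for every nth event, the event counter for sample i is simply i*n, and the average frequency over each interval follows directly:

def rebuild_events(timestamps, n):
    # Timestamps stored for every n-th event: the event count for
    # sample i is exactly i * n, so the average frequency over each
    # interval is n / dt. Note that dt = n / f, i.e. the observation
    # period scales with the input frequency, as mentioned above.
    events = [i * n for i in range(len(timestamps))]
    freqs = [n / (t1 - t0)
             for t0, t1 in zip(timestamps, timestamps[1:])]
    return events, freqs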

Recall, you will have to store and process this flood of data. For longer-tau plots you will be wading in large amounts of data anyway, so dropping high-frequency data to achieve a more manageable data rate is needed in order to be able to store and process the longer-tau data.

For most ADEV plots of stability, starting at 100 ms or 1 s is perfectly useful, so a measurement rate of 10 S/s is acceptable.

For high-speed things like startup burps etc. you have a different set of requirements. A counter capable of doing both would be great, but usually they don't.

Anyways, any drawbacks to calculating Allan Variance of a divided signal
that I am overlooking here?

Nothing significant. It adds to the noise floor, but in practice the
time-stamping and processing don't have big problems due to it.

Precisely what I was hoping for, thanks! :)

Great.
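If you want to convince yourself numerically, here is a toy simulation (all numbers made up): generate white phase noise on the edges of a 10 MHz signal, keep every Nth edge as a divider would, and compare the overlapping ADEV at the same tau. The divided record gives essentially the same answer, since the time error of the surviving edges is untouched; a real divider would only add its own jitter on top, which is the noise-floor contribution I mentioned.

import numpy as np

def overlapping_adev(x, tau0, m):
    # same overlapping estimator as in the sketch earlier in this mail
    d2 = x[2*m:] - 2.0 * x[m:-m] + x[:-2*m]
    return np.sqrt(np.mean(d2**2) / (2.0 * (m * tau0)**2))

f0 = 10e6                                   # 10 MHz input
x = 20e-12 * np.random.randn(2_000_000)     # 20 ps white phase noise per edge

N = 100                                     # "keep every Nth edge"
print(overlapping_adev(x, 1/f0, 10_000))    # every edge time-stamped, tau = 1 ms
print(overlapping_adev(x[::N], N/f0, 100))  # divided record, same tau = 1 ms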

Cheers,
Magnus
