So far, I’ve been configuring my 53220A for frequency measurements with a 500 
msec gate time, and using the external reference and one input.
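
For anyone wanting to replicate that, a minimal SCPI sequence for the single-channel setup might look like the following. This is a sketch based on my reading of the 53220A programming guide, not commands I've verified character-for-character on my unit: *RST resets, ROSC:SOUR EXT selects the external timebase, CONF:FREQ sets up a frequency measurement on channel 1, the GATE commands select a 500 ms time gate, and READ? arms the counter and returns one reading.

```
*RST
ROSC:SOUR EXT
CONF:FREQ (@1)
SENS:FREQ:GATE:SOUR TIME
SENS:FREQ:GATE:TIME 0.5
READ?
```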

If I instead feed the two devices into inputs A and B, ask for the time 
interval between them, and hand that to TimeLab, my results look quite a bit 
worse.

At the moment, I’m doing that with a pair of 5680As. The ADEV at tau=100s is 
reasonably close to the spec at 1.83E-12, but the ADEV at tau=10s is what it’s 
*supposed* to be at 1s: 1.43E-11. At 1s, it’s 1.42E-10. The line is quite 
linear between those points, but the slope is way off the spec. The frequency 
difference graph supports this view - it shows a ±2E-10 “haze.”
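
To make the “slope is way off” point concrete: the three points quoted above fall almost exactly on a tau^-1 line, which is the classic signature of white phase noise — i.e., a measurement noise floor — rather than the oscillators themselves (a good rubidium should roll off more gently). A quick log-log slope check, with the numbers copied from above:

```python
import math

# (tau in seconds, ADEV) as quoted above
points = [(1, 1.42e-10), (10, 1.43e-11), (100, 1.83e-12)]

# slope between successive points on a log-log plot
for (t1, s1), (t2, s2) in zip(points, points[1:]):
    slope = (math.log10(s2) - math.log10(s1)) / (math.log10(t2) - math.log10(t1))
    print(f"{t1}s -> {t2}s: slope = {slope:+.2f}")
```

Both segments come out near -1, which is what you'd expect if the counter's single-shot resolution (plus whatever the polling setup adds) dominates at these taus.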

I don’t have any reason to believe either oscillator is misbehaving to an 
extent that would explain this. I’m fairly sure I’m making some kind of 
fundamental newbie mistake and the test setup is introducing some sort of 
error, or I’m inside of the uncertainty of the 53220A and that’s why it’s 
showing poorly at low tau. My money is on the former. :)

The behavior I see suggests that TimeLab drives the 53220A by sending a 
command to obtain a single measurement, over and over again. Thus, the 
network latency figures into the measurement timespan, I think. I’m sure 
there’s a way to record measurements in the 53220A’s internal memory and then 
export that file into TimeLab, but I haven’t figured that out yet.
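
For what it’s worth, as I read the 53220A programming guide, the time-interval setup and the buffered alternative I’m hunting for would look roughly like this (unverified on my unit, so treat it as a sketch): CONF:TINT configures a time-interval measurement from channel 1 to channel 2, SAMP:COUN lets the counter free-run many readings into its internal memory per trigger, INIT starts the run, and R? fetches (and removes) a batch of buffered readings. With that scheme the measurement spacing is set by the counter’s own gating, not by network round-trips.

```
CONF:TINT (@1),(@2)
SAMP:COUN 100000
INIT
R? 1000
```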
_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.