Here's an interesting problem.

You have a fast sampler that is collecting samples off the air (e.g. recording the end of LORAN) with a fairly wide bandwidth: say 10 megasamples per second.

Those samples get post-processed in a digital downconverter (not necessarily in real time) into a narrower-band representation at a lower sample rate.
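For concreteness, here's a minimal sketch (Python/NumPy) of the kind of downconverter I mean: a complex NCO mix, a linear-phase FIR lowpass, and integer decimation. The decimation factor and filter length are placeholders, not anything specific.

import numpy as np
from scipy import signal

FS_IN = 10e6    # input sample rate: 10 Msps
DECIM = 100     # decimation factor -> 100 ksps complex output
NTAPS = 401     # odd-length linear-phase FIR, so the group delay is a whole
                # number of input samples

def ddc(x, f_center, fs_in=FS_IN, decim=DECIM, ntaps=NTAPS):
    """Mix real input samples down to complex baseband, lowpass, decimate."""
    n = np.arange(len(x))
    bb = x * np.exp(-2j * np.pi * f_center * n / fs_in)         # complex NCO mix
    taps = signal.firwin(ntaps, 0.4 * fs_in / decim, fs=fs_in)  # lowpass to the output band
    y = signal.lfilter(taps, 1.0, bb)
    return y[::decim]                                           # keep every decim-th sample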

You know when the input samples were acquired: e.g. you've got a good oscillator and a reliable sync pulse. For instance, your handy GPSDO (or the ensemble of H2 masers in your garage) might give you a 1pps tick good to, say, 20 ns, so you know when your 10 Msps samples were taken (to 20 ns).

Is there a consistent (and standardized) way to calculate and report the time of the output samples?

Each output sample is composed of information from multiple input samples.
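One self-consistent convention (whether or not it counts as a standard): timestamp output sample k with the input-clock time of the input sample it's aligned with. Continuing the sketch above, that means starting from the GPSDO-derived time of input sample 0, stepping by the decimation factor, and subtracting the FIR's group delay of (ntaps - 1)/2 input samples:

def output_timestamps(n_out, t0, fs_in=FS_IN, decim=DECIM, ntaps=NTAPS):
    """Times of the DDC output samples, referenced to the input clock.

    t0 is the time of input sample 0 (from the GPSDO/1pps).  With the
    convention y[n] = sum_m h[m]*x[n-m], a linear-phase FIR delays the
    signal by (ntaps-1)/2 input samples, so output sample k (= y[k*decim])
    is aligned with input sample k*decim - (ntaps-1)/2.  The first few
    outputs fall in the filter's start-up transient and land before t0;
    they'd normally be discarded.  A linear-phase FIR has the same delay
    at every frequency; with other filters (IIR, CIC stages) the delay
    varies across the band and the center-frequency value is the natural
    reference.
    """
    group_delay = (ntaps - 1) / 2                 # in input samples
    k = np.arange(n_out)
    return t0 + (k * decim - group_delay) / fs_in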

One could test the system by digitizing a signal with known timing (e.g. a 1 MHz sine wave whose zero crossing is "on the second") and then looking for the zero crossing in the downconverted output. Depending on the filtering in the downconverter, there's some delay-vs-frequency characteristic (the filter's group delay) that could be used to back out the deltas at other frequencies.
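Roughly what that test looks like against the sketches above: a 1 MHz sine whose positive-going zero crossing is at t = 0, downconverted with the center deliberately offset to 0.99 MHz (an arbitrary choice) so a 10 kHz difference tone survives into the output, where its zero crossings should land on multiples of 100 us.

t0 = 0.0                                   # input sample 0 taken "on the second"
t_in = t0 + np.arange(200000) / FS_IN
x = np.sin(2 * np.pi * 1e6 * t_in)         # positive-going zero crossing at t = 0

out = ddc(x, f_center=0.99e6)              # leaves a 10 kHz tone in the output
t_out = output_timestamps(len(out), t0)

# Find a positive-going zero crossing of the real part, past the FIR
# start-up transient, by linear interpolation between output samples.
re_out = out.real
k0 = NTAPS // DECIM + 1                    # skip the filter transient
k = next(k for k in range(k0, len(re_out) - 1)
         if re_out[k] < 0 <= re_out[k + 1])
frac = -re_out[k] / (re_out[k + 1] - re_out[k])
t_zc = t_out[k] + frac * DECIM / FS_IN

# Should land very close to 100 us (the first crossing past the transient);
# the residual is interpolation error plus any filter delta.
print(t_zc)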

So you could report the time of the low-rate output samples in terms of the time of the input samples, at least for the 'center frequency' of the downconverter.
