Hi,

Assuming you are trying to extract timing from the signal (the time ticks on WWVB), the down-conversion really does not matter. The ADC samples are what will "tag" your time data.

If you are trying to extract frequency from the signal (you are after the center frequency of WWVB), then both the offset oscillator and the ADC clock will matter (a rough numerical sketch is at the end of this message):
Your baseband tone is Fwwvb - Flo = Fif. Your estimate of that tone is based on the frequency of the ADC samples.

Bob

> On Dec 23, 2017, at 9:46 AM, Stephan Sandenbergh <ssandenbe...@gmail.com> wrote:
>
> Hi All,
>
> Consider the following very common scenario: a perfect RF signal is heterodyne down-converted to baseband using an offset oscillator. Let's assume this oscillator has x(t) = x0 + y0*t. This produces a time- and frequency-offset baseband signal. Then, this baseband signal is coherently ADC-sampled using that same offset oscillator.
>
> What would the effect of this coherent ADC sampling be?
>
> See attached diagram. Here I assumed the ADC timebase is a time-dependent function of the oscillator offset. However, it feels like I'm making a logic error? I can't remember ever seeing anyone account for the ADC time-base errors in coherent heterodyne down-converter stages. I have limited experience, though.
>
> Regards,
>
> Stephan.
> <img1.png>
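
To put rough numbers on the above (the LO frequency, sample rate, and offset values below are made-up examples, not anything from Stephan's setup), here is a short Python sketch that models both effects: the time tags inheriting the ADC clock offset, and the carrier estimate inheriting the combined LO and ADC clock offset.

# Toy model: a perfect 60 kHz carrier, with the LO and the ADC clock both
# derived from the same reference that runs fast by a fractional offset y.
f_rf     = 60_000.0   # true WWVB carrier, Hz (assumed exact)
f_lo_nom = 59_000.0   # nominal LO, Hz (made-up value, gives a 1 kHz nominal IF)
fs_nom   = 8_000.0    # nominal ADC sample rate, Hz (made-up value)
y        = 1e-9       # fractional frequency offset of the common oscillator

f_lo = f_lo_nom * (1 + y)   # what the mixer LO actually runs at
fs   = fs_nom   * (1 + y)   # what the ADC actually samples at

# (1) Timing: a tick detected at sample n gets tagged n/fs_nom seconds,
# but it really occurred at n/fs seconds, so the tags drift by about y*t,
# independent of anything the mixer did.
t_true = 1000.0             # a tick 1000 s into the record
n      = t_true * fs        # sample index where it lands
t_tag  = n / fs_nom         # the time we assign to that instant
print("time-tag error after 1000 s: %+.3e s" % (t_tag - t_true))      # ~ +1e-6 s

# (2) Frequency: the baseband tone is f_rf - f_lo, but we measure it against
# the (offset) sample clock while assuming fs_nom, then add back the nominal LO.
f_if_meas = (f_rf - f_lo) * (fs_nom / fs)
f_rf_est  = f_if_meas + f_lo_nom
print("fractional error of carrier estimate: %+.3e" % ((f_rf_est - f_rf) / f_rf))  # ~ -1e-9

If I have set this up right, the net result is the same as if you had counted the RF directly against the offset oscillator: the carrier estimate is low by the oscillator's fractional offset, so the coherent down-conversion neither cancels the error nor doubles it, but you do have to account for both the LO and the sample clock to see that.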