On 4/13/18 1:39 PM, Achim Gratz wrote:
Jim Lux writes:
So now the challenge is to "line 'em up".  An obvious approach is to
transmit an in-band pilot tone with some sync pattern, received by all,
and I'm working on that too.

A maybe not-so-obvious approach would be to use RTL-SDRs that have been
modified for direct sampling (usually via the Q branch) and inject your
timing pulse there.  That would limit the disturbance of the actual
signal while still being relatively easy to extract from the data stream.

That's where it's being injected... I'm using the RTL-SDR V.3, which has the RF input fed right to the Q input.



But right now, I have the idea of capacitively coupling the 1pps pulse
from the GPS to the antenna input - the fast rising and falling edges
are broadband and show up in the sampled data.

The trouble is that you are going to impair the already low dynamic
range.  The ENOB of the I/Q ADC is only around 7 bits.

Well, so far, after the DDC, the pulse is coming out at about 1/5 of the dynamic range, and I can always adjust the size of the coupling capacitor.
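
For concreteness, the sort of edge detection I have in mind is only a few lines.  This is just a sketch with made-up names and a placeholder threshold, assuming the DDC output is sitting in a numpy array of complex baseband samples:

import numpy as np

def find_pps_edges(iq, fs, threshold=0.2):
    """Return the sample index of the first threshold crossing in each
    second - the 1pps pulse shows up as a spike in the magnitude."""
    mag = np.abs(iq)
    hot = np.flatnonzero(mag > threshold)   # all samples above threshold
    edges = []
    last = -fs                              # suppress hits within ~0.5 s of the previous one
    for idx in hot:
        if idx - last > 0.5 * fs:
            edges.append(int(idx))
            last = idx
    return np.array(edges)

Whether a simple threshold is good enough, or whether it needs matched filtering against the actual pulse shape, is exactly the kind of thing I still have to find out.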




And you can see, no surprise, that the sample clock in the RTL isn't
dead on - over the 10 seconds, it looks like it drifts about 30-50
microseconds - that is, the RTL clock is slow by 3-5 ppm.
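
(For scale: 30-50 microseconds of slip over 10 seconds is 3e-6 to 5e-6, i.e. 3-5 ppm.  Assuming something like a 2.048 MS/s sample rate - just a guess for illustration - that's on the order of 60-100 samples of accumulated slip over those 10 seconds.)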

Not all of these are created equal.  Several manufacturers claim to
factory-calibrate their TCXO to better than 0.5 ppm.  I currently have
two RTL-SDRs that are certainly within 1 ppm.  These things get quite
hot, so it definitely takes some time before they stabilize even if
they do have a TCXO in them.

Could well be... I just turned it on, waited for the Beagle to boot, captured the data, and moved on.




So here's the question for the time-nuts hive-mind...
What's a good (or not so good) way to develop an estimator of the
timing/frequency error?  Post-processing minutes of data is just fine.
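
(To be a bit more concrete about what I mean by "estimator": the obvious thing that comes to mind is a least-squares line fit of the detected 1pps edge positions against pulse number - the slope is the actual samples-per-second and the intercept is the timing offset of the first pulse.  A rough sketch, assuming the edge indices have already been extracted, e.g. with something like the edge finder further up:)

import numpy as np

def fit_clock(edge_indices, fs_nominal):
    """Least-squares fit of 1pps edge sample indices vs. pulse number.
    Returns (fractional frequency error, offset of pulse 0 in samples)."""
    n = np.arange(len(edge_indices))            # pulse number, nominally 1 s apart
    slope, intercept = np.polyfit(n, edge_indices, 1)
    ffe = slope / fs_nominal - 1.0              # e.g. -3e-6 for a clock that is 3 ppm slow
    return ffe, intercept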

There is a program called rtl_test that just checks how many samples it
gets in a certain amount of time.  Let it run for a few hours on a PC
with a GPS-disciplined PC clock and it'll give you a pretty accurate
estimate of the mean sampling clock deviation.
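
The arithmetic behind that is simple enough to redo yourself on any capture where you trust the elapsed time - a sketch (the names here are mine, not rtl_test's):

def clock_ppm_from_count(num_samples, elapsed_seconds, fs_nominal):
    """Mean sample-clock error in ppm, from a long sample count against a
    trusted (e.g. GPS-disciplined) elapsed time."""
    fs_measured = num_samples / elapsed_seconds
    return (fs_measured / fs_nominal - 1.0) * 1e6

# e.g. 2.047994e10 samples over 10000 s at a nominal 2.048e6 S/s is about -3 ppm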

The other method is to tune to a signal of known frequency and check
what it reads as.  There is a program floating around that uses a GSM
station for that purpose.
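
Along the same lines, checking the tuner against any carrier you know is only a few lines once you have a capture - a rough sketch with made-up names, assuming the known carrier is the strongest thing in a complex-baseband recording:

import numpy as np

def freq_offset_ppm(iq, fs, f_tuned, f_known):
    """Estimate the frequency error in ppm by finding the strongest carrier
    in the spectrum and comparing it with where it should be."""
    win = np.hanning(len(iq))
    spectrum = np.fft.fftshift(np.fft.fft(iq * win))
    freqs = np.fft.fftshift(np.fft.fftfreq(len(iq), d=1.0 / fs))
    f_peak = freqs[np.argmax(np.abs(spectrum))]   # measured baseband offset of the carrier
    f_expected = f_known - f_tuned                # where it should sit if the clock were perfect
    return (f_peak - f_expected) / f_known * 1e6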

I'm not so concerned about the frequency measurement - that's "easy".  What I'm interested in is figuring out the precise timing (in absolute terms) of the samples.
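
Just to spell out what I mean by "absolute terms": once one 1pps edge has been located at a known sample index and tagged with its GPS second, and the fitted sample rate is known, every other sample gets a time.  A sketch, reusing the fit above:

def sample_time(n, edge_index, edge_gps_time, fs_measured):
    """Absolute time of sample n, given that the 1pps edge at GPS time
    edge_gps_time landed at sample index edge_index and the sample clock
    actually runs at fs_measured samples per second."""
    return edge_gps_time + (n - edge_index) / fs_measured

How well edge_index and fs_measured can be estimated from minutes of data is really the estimator question above.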


