Re: [Flexradio] Frequency Accuracy
At 09:23 AM 7/5/2005, John Ackermann N8UR wrote:

> Jim Lux wrote:
>> There you go, then... Fire up the spotting tone at some convenient frequency, run the audio output from the Delta 44 into your counter, and directly measure the clock rate. The ideal frequency would depend on your type of counter (straight gate or ratiometric, for instance).
>
> I'll do that (as well as measuring a recorded signal to bypass any contribution of the output process). But I still don't understand what was wrong with the original test. The question I was trying to answer was how much the soundcard contributed to the frequency accuracy of the SDR1000 system. The answer I got is based on exactly what happens in real use -- with the radio tuned to X.XXMHz (using an external DDS reference to reduce that error essentially to zero), a signal at X.XXMHz should yield a 600Hz output tone. If it doesn't, there's an error. That's what I was trying to quantify. While the actual sampling rate of the sound card is interesting, what's more important is its impact on the receiver system as a whole.

Yes, it does tell you what the error in the system as a whole is, but it doesn't tell you where you could or could not make any improvements. For instance, if you get a 0.1 Hz error, is that an additive error (all frequencies are off by 0.1 Hz) or a percentage error (in which case the error might vary as a function of band)? Even more telling is whether the error varies in some systematic way.

Naturally, if all you're doing is SSB voice or even CW, a frequency error of <1 Hz is totally academic. And most digital modes these days (e.g. PSK31) are "self tuning".
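[Editorial sketch: the additive-versus-percentage distinction above can be put in symbols. This toy model is an illustration, not code from the thread -- the function name and the three separate error terms are assumptions, including the simplification of a fixed ~11.025 kHz software IF.]

```python
def audio_offset_hz(f_rf_hz, lo_err_hz=0.0, dds_frac_err=0.0,
                    rate_frac_err=0.0, f_if_hz=11025.0):
    """Toy error model for the received audio tone.

    lo_err_hz     -- additive LO error: same offset in Hz on every band
    dds_frac_err  -- fractional DDS error: offset grows with RF frequency
    rate_frac_err -- fractional sound-card clock error: acts on the fixed
                     ~11 kHz software IF, so it is band-independent too
    """
    return (lo_err_hz
            + f_rf_hz * dds_frac_err
            + f_if_hz * rate_frac_err)

# A fractional DDS error shows up ~16x bigger on 10 m than on 160 m...
on_160m = audio_offset_hz(1.801e6, dds_frac_err=1e-7)
on_10m = audio_offset_hz(28.601e6, dds_frac_err=1e-7)

# ...while a sample-rate error gives the same offset on both bands,
# which is the band-independent signature John actually measured.
sc_160m = audio_offset_hz(1.801e6, rate_frac_err=2.4e-5)
sc_10m = audio_offset_hz(28.601e6, rate_frac_err=2.4e-5)
```

Measuring the offset on two widely separated bands therefore distinguishes the two cases in a single experiment.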
However, if you want to use your SDR in the ARRL Frequency Measuring Test, or for HF propagation studies, or as part of a multiport vector network analyzer, or if you're interested in connecting multiple SDRs together in a phased array, then these sorts of things get more important.

My experience has been that it's actually better to understand the error model (that is, which clocks/oscillators affect which measured values) than the actual error per se. Any precision measurement application is going to have to be calibrated anyway, and knowledge of the error model is needed to develop the calibration approach.

For instance, I use the SDRs in a direct-conversion-to-baseband application (i.e. I don't do a second conversion in software), with subsequent digital processing of the digitized data. I'm measuring the relative phases of several signals, all in the (audio) passband. For me, knowing the clock rate is very important, and is, in fact, the dominant source of error. In a looped-through application (such as the usual SDR application) the clock rate is less important, except as it affects the frequency of the CW tone, or an offset in SSB voice. Clearly, we can tolerate huge frequency offsets in SSB voice, so tiny fractional changes are insignificant. I'd expect even crummy clock oscillators on sound cards to be accurate to tenths of a percent, corresponding to a few Hz.

This discussion has been useful, because I had forgotten that the standard SDR software does a second downconversion from 11 kHz to take out the offset resulting from choosing relatively spur-free DDS frequencies.

> By the way, Bob -- thanks for noting the spur reduction issue. For my tests, I had spur reduction on as that's the way the radio is typically used.
>
> John

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
Re: [Flexradio] Frequency Accuracy
Jim Lux wrote:
> At 07:22 AM 7/5/2005, Frank Brickle wrote:
>> Jim Lux wrote:
>>> Seems that a better way to measure clock accuracy on the sound card is to generate a sine wave in software and run it out to the (external reference locked) counter.
>>
>> The spotting tone function in the DSP will generate a continuous sine, at an arbitrary frequency, with the CORDIC oscillator function used everywhere in the system.
>
> There you go, then... Fire up the spotting tone at some convenient frequency, run the audio output from the Delta 44 into your counter, and directly measure the clock rate. The ideal frequency would depend on your type of counter (straight gate or ratiometric, for instance).

I'll do that (as well as measuring a recorded signal to bypass any contribution of the output process). But I still don't understand what was wrong with the original test. The question I was trying to answer was how much the soundcard contributed to the frequency accuracy of the SDR1000 system. The answer I got is based on exactly what happens in real use -- with the radio tuned to X.XXMHz (using an external DDS reference to reduce that error essentially to zero), a signal at X.XXMHz should yield a 600Hz output tone. If it doesn't, there's an error. That's what I was trying to quantify. While the actual sampling rate of the sound card is interesting, what's more important is its impact on the receiver system as a whole.

By the way, Bob -- thanks for noting the spur reduction issue. For my tests, I had spur reduction on as that's the way the radio is typically used.

John
Re: [Flexradio] Frequency Accuracy
At 07:22 AM 7/5/2005, Frank Brickle wrote:

> Jim Lux wrote:
>> Seems that a better way to measure clock accuracy on the sound card is to generate a sine wave in software and run it out to the (external reference locked) counter.
>
> The spotting tone function in the DSP will generate a continuous sine, at an arbitrary frequency, with the CORDIC oscillator function used everywhere in the system.

There you go, then... Fire up the spotting tone at some convenient frequency, run the audio output from the Delta 44 into your counter, and directly measure the clock rate. The ideal frequency would depend on your type of counter (straight gate or ratiometric, for instance).

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
Re: [Flexradio] Frequency Accuracy
Jim Lux wrote:
> Seems that a better way to measure clock accuracy on the sound card is to generate a sine wave in software and run it out to the (external reference locked) counter.

The spotting tone function in the DSP will generate a continuous sine, at an arbitrary frequency, with the CORDIC oscillator function used everywhere in the system.

73
Frank AB2KT
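[Editorial sketch: for readers who haven't met it, a recursive rotation oscillator generates exactly this kind of continuous sine. The code below uses an explicit complex multiply where a CORDIC implementation would use shift-and-add micro-rotations; it is an illustration of the idea, not the actual PowerSDR DSP source.]

```python
import cmath
import math

def spotting_tone(freq_hz, sample_rate_hz, n_samples):
    """Generate a continuous sine by rotating a unit phasor once per
    sample.  A CORDIC oscillator computes the same per-sample rotation
    with shift-and-add stages instead of a complex multiply."""
    step = cmath.exp(2j * math.pi * freq_hz / sample_rate_hz)
    z = 1.0 + 0.0j
    out = []
    for _ in range(n_samples):
        out.append(z.imag)   # take the sine (Q) component
        z *= step            # advance the phase by one sample
    return out

tone = spotting_tone(600.0, 48000.0, 480)   # six full cycles of 600 Hz
```

Because the rotation step is an exact unit phasor, the frequency is set purely by the phase increment per sample -- which is why, as discussed elsewhere in this thread, the sound-card clock rate directly scales the tone's real-world frequency.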
Re: [Flexradio] Frequency Accuracy
At 10:49 PM 7/4/2005, Lyle Johnson wrote:

> ...If you're looking at the analog output of the sound card, which is essentially the "looped back" analog input to the card, then the sampling rate may not make any difference.
>
> Keep in mind that the SDR-1000 is not baseband in, baseband out, but uses an IF of 11 to 15 kHz depending on the exact frequency it is tuned to and whether spur reduction is on or off. Thus, the sound card oscillator is used to derive an oscillator at 11 to 15 kHz, which mixes the quadrature signal down to baseband. This is what John is attempting to quantify.

OK, then.. That makes the measurement methodology somewhat more complex. Seems that a better way to measure clock accuracy on the sound card is to generate a sine wave in software and run it out to the (external reference locked) counter. No hassles with the SDR, etc. One could also use the SDR as a precision-frequency audio source (albeit with spur and phase noise contributions) to validate that the A/D samples at the same frequency as the D/A.

When I was characterizing the AC97 codecs on my Via Mini-ITX mobos, I just created a big file with a sine wave in it and played it back with aplay (on a Linux box), then ran that to the measurement system. You're always better off measuring pieces than combinations. Then you can measure the combination and see if the uncertainties of the pieces combine like they should. This would be a good way to find subtle signal-processing software bugs such as dropped samples -- the frequency plan model is fairly straightforward, so you should be able to predict what's going on, and if it's different, then you know there's a problem. A subtle bug might be hiccups in the OS pushing the data in and out of the sound card interface.
I don't know how big the buffering is, but since Windows isn't hard real time, it's not inconceivable that there might be a buffer overrun or underrun because the kernel got busy doing something else (handling mouse clicks or network I/O), and, because it's random, you might not see it. Certainly, Windows isn't going to reliably tell you about it, because their (consumer) orientation is towards audible defects, and dropping a sample every second or so isn't going to be audible, especially if you "catch up" so that overall the sound stays synchronized. I'd also be interested to know if there's a variable latency between input and output -- that is, whether the A/D and D/A clocks are synchronized, or at least in a phase-stable relationship.

> If the SDR-1000 detector output were at baseband, then the sampling rate accuracy would indeed be irrelevant.

Except that any jitter on the sampling still contributes to noise levels.

> 73, Lyle KK7P

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
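[Editorial sketch: one way to catch the silent dropped-sample failure described above is to record a steady tone and look for steps in its sample-to-sample phase advance. The data here is synthetic for illustration; a real test would use the captured I/Q stream from the card.]

```python
import cmath
import math

def phase_step_indices(iq, tol_rad=0.5):
    """Return indices where the per-sample phase increment of a captured
    complex tone deviates from the mean increment.  A dropped (or
    repeated) sample appears as a single step in otherwise uniform
    phase advance, even if it is inaudible."""
    incs = [cmath.phase(b * a.conjugate()) for a, b in zip(iq, iq[1:])]
    mean = sum(incs) / len(incs)
    return [i for i, d in enumerate(incs) if abs(d - mean) > tol_rad]

# Synthetic 1 kHz tone at 10 kS/s, then the same tone with one sample
# deleted to mimic a buffer underrun the OS never reported.
fs, f = 10000.0, 1000.0
clean = [cmath.exp(2j * math.pi * f * n / fs) for n in range(200)]
glitched = clean[:100] + clean[101:]
```

The clean capture shows no steps; the glitched one flags exactly the position where the sample went missing, which is the kind of defect a simple average frequency count would smear out.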
Re: [Flexradio] Frequency Accuracy
At 03:27 AM 7/5/2005, Robert McGwier wrote:

> Do not forget that you can have two different conditions in your settings that will impact your analysis: spur reduction on, spur reduction off. With spur reduction turned off, the last conversion IN THIS DUAL CONVERSION SUPERHETERODYNE RX is done by the software oscillator at -11025 Hz. When spur reduction is turned on, we find the closest, most spur-free DDS setting, and then tune out the rest of the offset by -11025 - (desired frequency - DDSFreq). This implies that in both these settings, the sound card, which is the clock for this software oscillator, imposes a more complex relationship than seems to be implied here so far.

Yes indeedy.. For the SDR in, offset oscillator, audio out path:

  f_out = f_actualsample * (f_in / f_actualsample + f_idealoffset / f_idealsample)

(I think.. I haven't had my coffee yet.)

> A way to eliminate the software oscillator is to put the RADIO in SPEC mode and turn spur reduction off. The software oscillator is then set to 0 and is out of the circuit.
>
> Bob

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
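[Editorial sketch: plugging numbers into Jim's back-of-envelope formula. The 48 kHz nominal rate and the 11625 Hz input tone are illustrative assumptions; the -11025 Hz offset is the one Bob describes.]

```python
def f_out(f_in, f_actual_sample, f_ideal_sample, f_ideal_offset=-11025.0):
    """Loop-through output frequency per Jim's formula: the input tone
    survives the A/D -> D/A round trip unchanged (same clock both ways),
    but the software offset oscillator is programmed in cycles per
    sample, so its real frequency scales with the actual clock rate."""
    return f_actual_sample * (f_in / f_actual_sample
                              + f_ideal_offset / f_ideal_sample)

# Perfect clock: a tone 600 Hz above the -11025 Hz offset comes out at
# exactly the 600 Hz sidetone.
perfect = f_out(11625.0, 48000.0, 48000.0)

# Clock 10 ppm fast: the offset oscillator runs fast too, pulling the
# output tone about 0.11 Hz low (11025 Hz * 1e-5).
fast = f_out(11625.0, 48000.0 * (1 + 1e-5), 48000.0)
```

Note the error term is proportional to the offset oscillator frequency, not to the RF frequency -- consistent with John's observation that the error didn't change between 160 m and 10 m.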
Re: [Flexradio] Frequency Accuracy
Do not forget that you can have two different conditions in your settings that will impact your analysis: spur reduction on, spur reduction off. With spur reduction turned off, the last conversion IN THIS DUAL CONVERSION SUPERHETERODYNE RX is done by the software oscillator at -11025 Hz. When spur reduction is turned on, we find the closest, most spur-free DDS setting, and then tune out the rest of the offset by -11025 - (desired frequency - DDSFreq). This implies that in both these settings, the sound card, which is the clock for this software oscillator, imposes a more complex relationship than seems to be implied here so far.

A way to eliminate the software oscillator is to put the RADIO in SPEC mode and turn spur reduction off. The software oscillator is then set to 0 and is out of the circuit.

Bob

John Ackermann N8UR wrote:

> Jim Lux wrote:
>> Consider two RF carriers, at 10.001 and 10.002 MHz.
>>
>> If the DDS is perfect, at 10 MHz, and the sampler is perfect, at, say, 10 kHz, then you'll get two sine waves in the digitized sequence: one at 10 samples per cycle (the 1 kHz audio) and the other at 5 samples per cycle (the 2 kHz audio). The ratio between the two will be 1:2.
>>
>> If the DDS is off, but the sampler is perfect, then both RF frequencies will be shifted by the same amount. Say the DDS is at 9.999 MHz (a kHz low): the two audio frequencies will be 2 and 3 kHz, instead of 1 and 2 kHz, so your sampled data stream will have a 5 samples/cycle tone (the 2 kHz) and a 3.33 samples/cycle tone (the 3 kHz). The ratio is no longer 1:2 but something else (5:3.33).
>>
>> If the DDS is perfect, but the sampler is slow (say at 9 kHz instead of 10 kHz), then you'll get two signals at 1 and 2 kHz, but the sampled data stream will have a tone at 9 samples/cycle and one at 4.5 samples/cycle. The ratio is 1:2, but the actual value is different.
>> The effect is the same as the difference between playing a tape fast or slow (which preserves the harmonic relations, even if the pitch changes) and tuning high or low with SSB (which does not).
>
> And this is where I'm confused. I'm not (for this experiment) looking at the linearity of the passband, but rather the absolute accuracy of the frequency transformation. The change in pitch is what I'm measuring -- if everything is perfect, I know that an input precisely on the frequency the radio is tuned to will yield an output (audio tone) that's precisely 600Hz. If the sampling rate is off, the 600Hz tone will be off, which translates into a frequency error -- if the tone is 599Hz, that's the same as the radio being tuned 1Hz low in frequency. That's the frequency error I'm trying to measure.
>
> John

___
FlexRadio mailing list
FlexRadio@flex-radio.biz
http://mail.flex-radio.biz/mailman/listinfo/flexradio_flex-radio.biz
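[Editorial sketch: John's conversion from tone offset to tuning error is a subtraction, with a sign flip between sidebands. The sign convention below is an assumption based on his description of the CW-U/CW-L inversion.]

```python
def tuning_error_hz(measured_tone_hz, sideband="CW-U", nominal_hz=600.0):
    """Read the audio tone's departure from the nominal 600 Hz CW
    offset as a tuning error in Hz.  The sense inverts between CW-U
    and CW-L, which is why John saw the same magnitude with opposite
    sign on the two modes."""
    delta = measured_tone_hz - nominal_hz
    return delta if sideband == "CW-U" else -delta

# A 599 Hz tone in CW-U reads as the radio tuned 1 Hz low.
err = tuning_error_hz(599.0)
```

This is the whole figure of merit for John's test: whatever happens inside the signal chain, the departure of the tone from 600 Hz is the end-to-end frequency error.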
Re: [Flexradio] Frequency Accuracy
> ...If you're looking at the analog output of the sound card, which is essentially the "looped back" analog input to the card, then the sampling rate may not make any difference.

Keep in mind that the SDR-1000 is not baseband in, baseband out, but uses an IF of 11 to 15 kHz depending on the exact frequency it is tuned to and whether spur reduction is on or off. Thus, the sound card oscillator is used to derive an oscillator at 11 to 15 kHz, which mixes the quadrature signal down to baseband. This is what John is attempting to quantify.

If the SDR-1000 detector output were at baseband, then the sampling rate accuracy would indeed be irrelevant.

73, Lyle KK7P
Re: [Flexradio] Frequency Accuracy
At 06:03 PM 7/4/2005, John Ackermann N8UR wrote:

> Jim Lux wrote:
>> Consider two RF carriers, at 10.001 and 10.002 MHz ...
>> ... changes) and tuning high or low with SSB (which does not).
>
> And this is where I'm confused. I'm not (for this experiment) looking at the linearity of the passband, but rather the absolute accuracy of the frequency transformation. The change in pitch is what I'm measuring -- if everything is perfect, I know that an input precisely on the frequency the radio is tuned to will yield an output (audio tone) that's precisely 600Hz. If the sampling rate is off, the 600Hz tone will be off, which translates into a frequency error -- if the tone is 599Hz, that's the same as the radio being tuned 1Hz low in frequency. That's the frequency error I'm trying to measure.

But how are you measuring it? By looking at a .WAV file (or equivalent) with the digitized data, or by looking at an analog signal generated by the sound card? If the former (or, equivalently, looking for the peak in the spectrum display), then you are measuring sound card sampling rate error. If you're looking at the analog output of the sound card, which is essentially the "looped back" analog input to the card, then the sampling rate may not make any difference.

> John

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
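[Editorial sketch: the .WAV-versus-analog distinction Jim draws can be made concrete. The digitized data embeds the tone as cycles per sample, so how you read the data back determines whether the clock error is visible. Rates below are made-up illustrative numbers.]

```python
def freq_seen_in_wav(f_in_hz, true_rate_hz, assumed_rate_hz):
    """Inspecting the digitized data: the tone really occupies
    f_in/true_rate cycles per sample, but software interpreting the
    file at the assumed (nominal) rate reports a scaled frequency.
    This measurement DOES see the sampling-rate error."""
    return f_in_hz / true_rate_hz * assumed_rate_hz

def freq_after_loopback(f_in_hz, true_rate_hz):
    """A/D then D/A on the same clock: cycles per sample are preserved
    and replayed at the same true rate, so the analog tone returns at
    exactly f_in.  This measurement does NOT see the error."""
    cycles_per_sample = f_in_hz / true_rate_hz
    return cycles_per_sample * true_rate_hz

wav_reading = freq_seen_in_wav(1000.0, 9500.0, 10000.0)   # ~1052.6 Hz
loop_reading = freq_after_loopback(1000.0, 9500.0)        # 1000 Hz
```

So a counter on the analog output only exposes the clock error if something frequency-dependent (like the software offset oscillator) happens between A/D and D/A, which is exactly the point argued later in the thread.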
Re: [Flexradio] Frequency Accuracy
Jim Lux wrote:
> Consider two RF carriers, at 10.001 and 10.002 MHz.
>
> If the DDS is perfect, at 10 MHz, and the sampler is perfect, at, say, 10 kHz, then you'll get two sine waves in the digitized sequence: one at 10 samples per cycle (the 1 kHz audio) and the other at 5 samples per cycle (the 2 kHz audio). The ratio between the two will be 1:2.
>
> If the DDS is off, but the sampler is perfect, then both RF frequencies will be shifted by the same amount. Say the DDS is at 9.999 MHz (a kHz low): the two audio frequencies will be 2 and 3 kHz, instead of 1 and 2 kHz, so your sampled data stream will have a 5 samples/cycle tone (the 2 kHz) and a 3.33 samples/cycle tone (the 3 kHz). The ratio is no longer 1:2 but something else (5:3.33).
>
> If the DDS is perfect, but the sampler is slow (say at 9 kHz instead of 10 kHz), then you'll get two signals at 1 and 2 kHz, but the sampled data stream will have a tone at 9 samples/cycle and one at 4.5 samples/cycle. The ratio is 1:2, but the actual value is different.
>
> The effect is the same as the difference between playing a tape fast or slow (which preserves the harmonic relations, even if the pitch changes) and tuning high or low with SSB (which does not).

And this is where I'm confused. I'm not (for this experiment) looking at the linearity of the passband, but rather the absolute accuracy of the frequency transformation. The change in pitch is what I'm measuring -- if everything is perfect, I know that an input precisely on the frequency the radio is tuned to will yield an output (audio tone) that's precisely 600Hz. If the sampling rate is off, the 600Hz tone will be off, which translates into a frequency error -- if the tone is 599Hz, that's the same as the radio being tuned 1Hz low in frequency. That's the frequency error I'm trying to measure.

John
Re: [Flexradio] Frequency Accuracy
At 02:36 PM 7/4/2005, John Ackermann N8UR wrote:

> Jim Lux wrote:
>> So this uses the SDR1000 as an audio frequency generator, right? The Delta 44's not in the picture at this point, or if it is, it's basically digitizing and playing back at the same sample rate.
>
> Hi Jim -- I'm not sure I follow. If the soundcard is not clocking at the rate the DSP software assumes it is, the software's understanding of what frequency it's dealing with will be off by that error.

Yes, but it's a multiplicative error, as opposed to an additive error (the error in the LO frequency would be an additive error).

Consider two RF carriers, at 10.001 and 10.002 MHz.

If the DDS is perfect, at 10 MHz, and the sampler is perfect, at, say, 10 kHz, then you'll get two sine waves in the digitized sequence: one at 10 samples per cycle (the 1 kHz audio) and the other at 5 samples per cycle (the 2 kHz audio). The ratio between the two will be 1:2.

If the DDS is off, but the sampler is perfect, then both RF frequencies will be shifted by the same amount. Say the DDS is at 9.999 MHz (a kHz low): the two audio frequencies will be 2 and 3 kHz, instead of 1 and 2 kHz, so your sampled data stream will have a 5 samples/cycle tone (the 2 kHz) and a 3.33 samples/cycle tone (the 3 kHz). The ratio is no longer 1:2 but something else (5:3.33).

If the DDS is perfect, but the sampler is slow (say at 9 kHz instead of 10 kHz), then you'll get two signals at 1 and 2 kHz, but the sampled data stream will have a tone at 9 samples/cycle and one at 4.5 samples/cycle. The ratio is 1:2, but the actual value is different.

The effect is the same as the difference between playing a tape fast or slow (which preserves the harmonic relations, even if the pitch changes) and tuning high or low with SSB (which does not).

> Think about the extreme case -- the soundcard is sampling at 24 ksamples, while the software processing the bitstream thinks the sample rate is 48 ksamples (forget about underrun problems for the moment).
> That's going to result in a 1 kHz input being presented to the DSP layer as a 2 kHz signal. That will result in all sorts of possible errors as the signal is downconverted and otherwise processed. (Remember, there is a digital downconversion in the DSP layer; the input to the soundcard is the IF at 11.025 kHz.) We're talking about much smaller errors here (maybe parts in 10e5), but the impact is the same -- an error in sample rate will cause the DSP code to think the frequency is slightly different than it actually is, and that will result in an erroneous output signal.

Yes... but the test described, I think, took the digitized signal and then turned it back into an analog signal for counting, so the sampling rate error would cancel. For example, if a 1 kHz tone is sampled at 10 kHz, for 10 samples/cycle, and then D/A'd at 10 kHz, it will still be 1 kHz. If the same tone is sampled at 9.5 kHz, you'll get 9.5 samples/cycle, but if you D/A it back at the same 9.5 kHz, the output will still be 1 kHz. Most of the signal processing in the SDR-1000 is linear (as in FFTs, filters, etc.), and so will be sample rate independent (there's no harmonic content being added).

> The fact that we're using the same soundcard (and therefore the same clock) for both sampling and playback complicates the picture, but since the playback frequency is digitally mixed down from the input frequency, the two operations aren't likely to cancel each other out because the error will scale across that 20 to one frequency range.

You raise an interesting point. If there's a digital mixing process in the PC, then you could get a frequency error between input and output. Example: a 3 kHz input, at a 10 kHz sample rate, mixed with 1 kHz to shift the 3 kHz down to 2 kHz, then sent back out at the 10 kHz sampling rate. (I'm assuming analytic (complex) signals here.) If you had an error, say a 9 kHz actual sampling rate, then the mix frequency would really be 900 Hz, and the output would be at 2.1 kHz.

> Or am I missing something?
> I'm not a DSP (or, for that matter, any kind of) engineer, so I'm sure I haven't used the right terminology, but hopefully you can understand what I'm trying to say.
>
> John

James Lux, P.E.
Spacecraft Radio Frequency Subsystems Group
Flight Communications Systems Section
Jet Propulsion Laboratory, Mail Stop 161-213
4800 Oak Grove Drive, Pasadena CA 91109
tel: (818)354-2075  fax: (818)393-6875
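[Editorial sketch: Jim's 3 kHz example checks out arithmetically. The key assumption, stated in the thread, is that the software mixer is programmed in cycles per sample, so its true frequency rides on the real clock, while the input tone survives the same-clock A/D -> D/A round trip unchanged.]

```python
def loopthrough_output_hz(f_in, true_rate, nominal_rate, mix_nominal):
    """Output frequency of the loop-through path with a digital mixer.
    The input tone is unchanged by A/D -> D/A on one clock, but the
    software mixer's real frequency scales with the true sample rate,
    so the mix product moves when the clock is off."""
    mix_true = mix_nominal * true_rate / nominal_rate
    return f_in - mix_true

# Jim's numbers: 3 kHz in, 1 kHz nominal mix, 10 kHz nominal rate.
good_clock = loopthrough_output_hz(3000.0, 10000.0, 10000.0, 1000.0)
slow_clock = loopthrough_output_hz(3000.0, 9000.0, 10000.0, 1000.0)
```

With a perfect clock the output lands at 2 kHz; with the clock actually at 9 kHz the mixer really runs at 900 Hz and the output moves to 2.1 kHz, exactly as Jim works out above.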
Re: [Flexradio] Frequency Accuracy
Jim Lux wrote:
> So this uses the SDR1000 as an audio frequency generator, right? The Delta 44's not in the picture at this point, or if it is, it's basically digitizing and playing back at the same sample rate.

Hi Jim -- I'm not sure I follow. If the soundcard is not clocking at the rate the DSP software assumes it is, the software's understanding of what frequency it's dealing with will be off by that error. Think about the extreme case -- the soundcard is sampling at 24 ksamples, while the software processing the bitstream thinks the sample rate is 48 ksamples (forget about underrun problems for the moment). That's going to result in a 1 kHz input being presented to the DSP layer as a 2 kHz signal. That will result in all sorts of possible errors as the signal is downconverted and otherwise processed. (Remember, there is a digital downconversion in the DSP layer; the input to the soundcard is the IF at 11.025 kHz.)

We're talking about much smaller errors here (maybe parts in 10e5), but the impact is the same -- an error in sample rate will cause the DSP code to think the frequency is slightly different than it actually is, and that will result in an erroneous output signal. The fact that we're using the same soundcard (and therefore the same clock) for both sampling and playback complicates the picture, but since the playback frequency is digitally mixed down from the input frequency, the two operations aren't likely to cancel each other out because the error will scale across that 20 to one frequency range. Or am I missing something?

I'm not a DSP (or, for that matter, any kind of) engineer, so I'm sure I haven't used the right terminology, but hopefully you can understand what I'm trying to say.

John
Re: [Flexradio] Frequency Accuracy
At 12:13 PM 7/4/2005, John Ackermann N8UR wrote:

> Following up the conversation a week or two ago about frequency accuracy in the SDR-1000, I did a test today that may shed some light on the contribution of the soundcard clock to the error budget. I used a Delta 44 that I got from Gerald a couple of weeks ago, so it's recent production. I used beta 1.3.13.
>
> The short story is that I measured the tone coming out of the radio when receiving a CW carrier on a known frequency to see how far it deviates from the nominal 600Hz offset. The answer is that the error was about 0.27Hz, and was not frequency dependent -- it was essentially the same on 160M as on 10M. The sign of the error changed between CW-L and CW-U, but the amount remained the same. So, the bottom line is that the Delta 44 has a pretty accurate clock, and may contribute much less than 1Hz of frequency error. And, the error is fixed and doesn't scale with frequency, but does invert with sideband. (Of course, this is based on a sample size of one card, but some time ago I measured another, older, Delta 44 using a different technique and came up with similar results, so I suspect these are typical results.)
>
> More details on the test: My SDR-1000 used an HP 5065A Rubidium frequency standard as its external reference. The Rb is known to be within about 1 part in 10e12 versus GPS (that's an error of 1uHz at 1MHz). I injected an accurately known frequency into the SDR-1000, put the radio in CW-U mode with a narrow filter, and then read the audio output frequency on a high-resolution counter. The input signal was about -70dBm and came from a low-noise synthesizer with no (intentional) modulation. The synthesizer was referenced to an HP Z3801A GPS disciplined oscillator, as was the counter. The Z3801A is also normally within about 1 part in 10e12 of GPS.

So this uses the SDR1000 as an audio frequency generator, right?
The Delta 44's not in the picture at this point, or if it is, it's basically digitizing and playing back at the same sample rate.

What if you take the audio output from the SDR1000 and run it into the counter directly? You could count either L or R (I or Q); they should both be at the same frequency (i.e. the difference between your RF input and whatever the DDS is set to). The actual frequency will be different from the "dial frequency" because of the logic that selects DDS frequencies that have low spurs, of course.

I'm assuming that the signal chain is: RF shifted to baseband by the DDS LO, digitized (at the Delta 44 clock rate), filtered, converted to analog (at the same Delta 44 clock rate), then counted. Errors in the Delta 44 rate will change the center frequency of the filter, but not the frequency of the output vs. the input. There might be some variability because the software digitizes in batches, then filters in a batch, then D/As in batches, so the frequency at which the final playback occurs might be a bit different from the frequency at which the digitizing was done.

> At 1.801MHz, using CW-U and a 25Hz filter, the audio output was 599.712Hz. At 28.601MHz, with the same conditions, the audio output was 599.739Hz. In both cases, I averaged for 10,000 single-gate samples. The standard deviation in each case was about 0.3Hz.

How long was the counter gate interval? If your counter is a zero-deadtime counter, what you're basically doing is measuring the Allan deviation (i.e. the SD of 0.3 Hz out of 600 Hz is 0.0005 Allan deviation; you'd square for the approximate Allan variance).

> Averaging the two results and applying a bit of windage, the offset is about 0.27Hz. Changing to CW-L gave the same error, but the tone was slightly above, rather than below, 600Hz. All the gear, including the PC and Delta 44, had been powered up for a couple of days, so should have been thermally stable. The ambient room temperature was around 73 degrees.
A better way to separate the two effects would be to transmit two RF signals at a precise difference (of some few kHz), and then look at the digitized output file from the audio card. This would allow you to separate DDS LO frequency effects (which will translate both frequencies by the same amount) from sampling rate effects (which will have a multiplicative effect).

A handy way to do this is to look at an AM signal with a precision tone, e.g. WWV (WWVH): the AM detected tone (i.e. the envelope) is independent of the RF carrier frequency. Of course, you have ionospheric problems with WWV, although the 60 kHz WWVB might work. If you have a good signal generator that can generate modulation locked to your same frequency standard, you could do it that way. I don't know, offhand, whether signal generators like the 8640B generate the modulation referenced to the same external source. At work, I use things like a 3325 (which can take an external 10 or 1 MHz reference) to generate the modulation for a higher frequency source (like an 8663), or, f
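[Editorial sketch: the two-signal method separates the effects because each error leaves a different fingerprint on the pair. Apparent frequencies are as read from the digitized data at the nominal rate; the numbers reuse Jim's earlier 1 kHz / 2 kHz example.]

```python
def apparent_pair(f1, f2, lo_err_hz=0.0, rate_scale=1.0):
    """Two audio tones at a precise spacing.  An LO/DDS error adds the
    same number of Hz to both: spacing preserved, ratio changed.  A
    sampling-rate error scales both: ratio preserved, spacing changed."""
    return ((f1 + lo_err_hz) * rate_scale,
            (f2 + lo_err_hz) * rate_scale)

# DDS 1 kHz low: tones appear at 2 and 3 kHz -- still 1 kHz apart,
# but the 1:2 ratio is broken.
a, b = apparent_pair(1000.0, 2000.0, lo_err_hz=1000.0)

# Sampler at 9 kHz but data read as 10 kHz: both tones scale by 10/9
# -- the 1:2 ratio survives, but the spacing stretches.
c, d = apparent_pair(1000.0, 2000.0, rate_scale=10.0 / 9.0)
```

So from a single two-tone capture, the spacing pins down the sampling-rate error and the residual common shift pins down the LO error.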
Re: [Flexradio] Frequency Accuracy
ecellison wrote:
> However, there is so much going on in the SDR-1000 and PowerSDR software world there are too many projects all at once! It may take till the end of the year to get a lot done on this. I am gathering hardware and looking forward to the Reflock 2, but would like to see it run native at 200 MHz to supply to the SDR. Could also use a DDS at 200 MHz, corrected at 200 MHz. Will need some USB glue pieces to get it all together with the project that Phil - VK6APH and others are working on.

I should have reported this earlier, sorry -- in theory, the Reflock II should work at 200 MHz without any problems; of course we'll need to find a decent 200 MHz VCXO for it to drive.

John
RE: [Flexradio] Frequency Accuracy
John,

That really is good news! I am still studying this, and reading. You guys got me hooked. I think a really accurate, long- and short-term, low-phase-shift device is doable at a very reasonable cost -- < 200 bux on a 'put it together yourself' budget. However, there is so much going on in the SDR-1000 and PowerSDR software world there are too many projects all at once! It may take till the end of the year to get a lot done on this. I am gathering hardware and looking forward to the Reflock 2, but would like to see it run native at 200 MHz to supply to the SDR. Could also use a DDS at 200 MHz, corrected at 200 MHz. Will need some USB glue pieces to get it all together with the project that Phil - VK6APH and others are working on.

Thanks for keeping the thread and idea alive!

Eric

-Original Message-
From: [EMAIL PROTECTED] [mailto:[EMAIL PROTECTED] On Behalf Of John Ackermann N8UR
Sent: Monday, July 04, 2005 3:14 PM
To: FlexRadio Reflector
Subject: [Flexradio] Frequency Accuracy

> Following up the conversation a week or two ago about frequency accuracy in the SDR-1000, I did a test today that may shed some light on the contribution of the soundcard clock to the error budget. I used a Delta 44 that I got from Gerald a couple of weeks ago, so it's recent production. I used beta 1.3.13.
>
> The short story is that I measured the tone coming out of the radio when receiving a CW carrier on a known frequency to see how far it deviates from the nominal 600Hz offset. The answer is that the error was about 0.27Hz, and was not frequency dependent -- it was essentially the same on 160M as on 10M. The sign of the error changed between CW-L and CW-U, but the amount remained the same. So, the bottom line is that the Delta 44 has a pretty accurate clock, and may contribute much less than 1Hz of frequency error. And, the error is fixed and doesn't scale with frequency, but does invert with sideband.
> (Of course, this is based on a sample size of one card, but some time ago I measured another, older, Delta 44 using a different technique and came up with similar results, so I suspect these are typical results.)
>
> More details on the test: My SDR-1000 used an HP 5065A Rubidium frequency standard as its external reference. The Rb is known to be within about 1 part in 10e12 versus GPS (that's an error of 1uHz at 1MHz). I injected an accurately known frequency into the SDR-1000, put the radio in CW-U mode with a narrow filter, and then read the audio output frequency on a high-resolution counter. The input signal was about -70dBm and came from a low-noise synthesizer with no (intentional) modulation. The synthesizer was referenced to an HP Z3801A GPS disciplined oscillator, as was the counter. The Z3801A is also normally within about 1 part in 10e12 of GPS.
>
> At 1.801MHz, using CW-U and a 25Hz filter, the audio output was 599.712Hz. At 28.601MHz, with the same conditions, the audio output was 599.739Hz. In both cases, I averaged for 10,000 single-gate samples. The standard deviation in each case was about 0.3Hz. Averaging the two results and applying a bit of windage, the offset is about 0.27Hz. Changing to CW-L gave the same error, but the tone was slightly above, rather than below, 600Hz. All the gear, including the PC and Delta 44, had been powered up for a couple of days, so should have been thermally stable. The ambient room temperature was around 73 degrees.
>
> 73,
> John
[Flexradio] Frequency Accuracy
Following up the conversation a week or two ago about frequency accuracy in the SDR-1000, I did a test today that may shed some light on the contribution of the soundcard clock to the error budget. I used a Delta 44 that I got from Gerald a couple of weeks ago, so it's recent production. I used beta 1.3.13.

The short story is that I measured the tone coming out of the radio when receiving a CW carrier on a known frequency to see how far it deviates from the nominal 600Hz offset. The answer is that the error was about 0.27Hz, and was not frequency dependent -- it was essentially the same on 160M as on 10M. The sign of the error changed between CW-L and CW-U, but the amount remained the same. So, the bottom line is that the Delta 44 has a pretty accurate clock, and may contribute much less than 1Hz of frequency error. And, the error is fixed and doesn't scale with frequency, but does invert with sideband. (Of course, this is based on a sample size of one card, but some time ago I measured another, older, Delta 44 using a different technique and came up with similar results, so I suspect these are typical results.)

More details on the test: My SDR-1000 used an HP 5065A Rubidium frequency standard as its external reference. The Rb is known to be within about 1 part in 10e12 versus GPS (that's an error of 1uHz at 1MHz). I injected an accurately known frequency into the SDR-1000, put the radio in CW-U mode with a narrow filter, and then read the audio output frequency on a high-resolution counter. The input signal was about -70dBm and came from a low-noise synthesizer with no (intentional) modulation. The synthesizer was referenced to an HP Z3801A GPS disciplined oscillator, as was the counter. The Z3801A is also normally within about 1 part in 10e12 of GPS.

At 1.801MHz, using CW-U and a 25Hz filter, the audio output was 599.712Hz. At 28.601MHz, with the same conditions, the audio output was 599.739Hz. In both cases, I averaged for 10,000 single-gate samples.
The standard deviation in each case was about 0.3Hz. Averaging the two results and applying a bit of windage, the offset is about 0.27Hz. Changing to CW-L gave the same error, but the tone was slightly above, rather than below, 600Hz.

All the gear, including the PC and Delta 44, had been powered up for a couple of days, so should have been thermally stable. The ambient room temperature was around 73 degrees.

73,
John
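[Editorial sketch: one way to read this result -- an interpretation, not a conclusion stated in the thread. If the entire 0.27 Hz offset is charged to the sound-card clock acting on the nominal 11025 Hz software IF (plausible, since the DDS was externally referenced), the implied fractional clock error is a couple of tens of ppm, comfortably inside an ordinary crystal oscillator's spec.]

```python
# Attribute the measured tone offset entirely to the sound-card clock
# acting on the nominal -11025 Hz software offset oscillator.  (An
# assumption: with spur reduction on, the offset actually varies over
# roughly 11 to 15 kHz, so this is only an order-of-magnitude figure.)
if_hz = 11025.0
tone_offset_hz = 0.27

frac_error = tone_offset_hz / if_hz   # ~2.45e-5
clock_error_ppm = frac_error * 1e6    # ~24.5 ppm
```

A band-independent, sideband-inverting offset of this size is exactly what a ~25 ppm sound-card clock error would produce through the fixed software IF, which is consistent with the "fixed, doesn't scale, inverts with sideband" behavior reported above.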