My SDR1K arrived two weeks ago, and I've spent the last week playing with it
and, in the process, have made a few observations regarding its performance
and operation.  Others may be interested in what I observed...

First, I was surprised to observe that, after nulling the received image
down to the noise floor, the image popped back up if I tuned off-frequency a
bit.  So I ran a few experiments:

Experiment 1. Null the image at a given frequency, and then, with the
receiver tuned to the null frequency (centered in the display) and without
changing the receiver's frequency, change the generator's frequency. If
re-nulling is required, note the new Phase and Gain values. 
I set my generator at 19 MHz and nulled the image at 19.02205 MHz (the "spur
reduction" box must be *unselected* - otherwise, the image could be at some
other frequency). Then, while monitoring 19.02205 MHz, I changed the
frequency of my generator both above and below 19 MHz in 1 kHz steps.
The results surprised me. As I moved the generator off of 19 MHz by *only* 1
kHz, the image returned, and it became larger and larger the further I moved
the generator off frequency (by only a few kHz) in either direction.
If I renulled the image manually (keeping the receiver at 19.02205 MHz), I
had to change the Phase setting by larger and larger amounts (and these were
significant amounts!) as I moved the generator further and further off
frequency in either direction.
Experiment 2. Null the image at a given frequency and note the Phase and
Gain values. Shift the generator by 100 kHz and renull the image, using the
same frequency offset to center the null in the display (22.05 kHz for my
receiver - note that "spur reduction" must be unselected). Note the new
Phase and Gain values.
I set my generator at 19 MHz and nulled the image at 19.02205 MHz. Then I
set my generator to 18.9 MHz and nulled the image at 18.92205 MHz, making
sure that the image was properly centered on my display each time. 
Interestingly, the phase and gain settings were very close to each other for
the two frequencies, and much closer to each other than the values I had
measured in Experiment 1, in which I moved the generator frequency by only
1 kHz rather than 100 kHz.
From the results of Experiment 1, I'm guessing that the PowerSDR software
phase-nulling algorithm does a simple time-delay shift when you adjust the
"phase" control to reduce the received spur.  Adjusting a time delay between
the I & Q channels will create a great null (into the noise floor) at that
single audio frequency, but the null immediately degrades as one moves off
the nulled frequency - I've measured it degrading by 20 dB or more at
+/- 10 kHz (making the image rejection on the order of 50-60 dB).
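
As a back-of-the-envelope check on this guess (a quick Python sketch of my
own, not taken from the PowerSDR source), the textbook image-rejection ratio
for an I/Q gain ratio g and phase error phi is
(1 + 2g*cos(phi) + g^2) / (1 - 2g*cos(phi) + g^2).  If the hardware's phase
error is roughly flat across the audio passband but is cancelled with a pure
time delay, the applied correction grows linearly with audio frequency, so
the residual error - and the null - degrades as you tune away from the
nulled offset:

    import math

    def irr_db(gain_ratio, phase_err_deg):
        # Textbook image-rejection ratio for an I/Q gain ratio and phase error.
        g = gain_ratio
        phi = math.radians(phase_err_deg)
        num = 1 + 2 * g * math.cos(phi) + g * g
        den = 1 - 2 * g * math.cos(phi) + g * g
        return float("inf") if den == 0 else 10 * math.log10(num / den)

    # Hypothetical numbers: a flat 1-degree hardware phase error, cancelled by
    # a time delay chosen so the null is perfect at the 22.05 kHz offset.
    PHASE_ERR_DEG = 1.0
    F_NULL_HZ = 22.05e3

    for f_hz in (12.05e3, 17.05e3, 22.05e3, 27.05e3, 32.05e3):
        # A pure delay applies a phase that scales with audio frequency, so it
        # over- or under-corrects everywhere except at the nulled frequency.
        residual_deg = PHASE_ERR_DEG * (1.0 - f_hz / F_NULL_HZ)
        print("%6.2f kHz: residual %+5.2f deg, IRR ~ %.0f dB"
              % (f_hz / 1e3, residual_deg, irr_db(1.0, residual_deg)))

With the assumed 1-degree error this predicts roughly 48 dB of rejection at
+/- 10 kHz from the null - the same ballpark as what I measured - whereas a
frequency-flat phase/gain correction would not fall apart this way.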


Some other observations:

1.  Slightly nudging or bumping the 3.5mm plug inserted into the SDR1K's "To
Line In" connector can dramatically change the Receiver's image rejection,
and it often does not return to its original value after the nudge.  A more
secure mechanical connection for this signal-pair (received I & Q) is
recommended.

2.  Different Phase and Gain values are required whenever a filter "band"
changes (e.g. if you hear relays click when changing frequencies), even if
the frequency change is very small.

3.  The Phase and Gain values required to null a signal can change
significantly from one end of a filter "band" to the other (e.g. from 14.5
MHz to 21.5 MHz).

4.  The image null can change significantly (20 dB) when changing the Preamp
setting from Med (or High) to Low or Off.

5. Transmitter image rejection changes from band to band.  It would be nice
to have independent settings for each band.

In addition to these observations regarding Image Rejection, I also noticed
that Carrier Suppression, when operating either USB or LSB, varied from band
to band.  I compared the amount of carrier observed when transmitting in AM
to the amount of "suppressed" carrier observed when transmitting DSB
(measurements were all made using the SDR1K's QRP output fed, through an
attenuator, to an 8568B spectrum analyzer).  On 40 meters carrier suppression
was on the order of 50 dB or better.  On 20 meters it was 40 dB, and the
worst case was 35 dB on 12 meters.

I'm curious as to the cause of this variation in carrier suppression with
frequency.  The fact that some carrier is present on sideband implies (to me)
that there is a DC offset at the inputs of the transmit FST3253 (IC2 on the
TRX board), which at first glance would seem surprising, given that the
outputs of the driving op-amps are AC coupled.  But there *is* a possible
source of DC, which is the DC bias voltage applied to T1.2.  Perhaps there is
a leakage problem with one or more of the 22 uF caps, or perhaps this
voltage at T1.2, in combination with parasitics at IC2's inputs (and heaven
knows what the 22 uF caps and the DRV135 outputs look like at 24.9 MHz), is
somehow creating an offset?  An interesting experiment might be to add a
low-impedance path (at RF) from IC2's inputs to ground (or perhaps to T1.2)
using some small-valued chip caps and see how this affects carrier
suppression...
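
As a similarly rough sketch (the 1 V peak drive level below is purely a
hypothetical number, not a measurement): if the residual carrier comes
entirely from an effective DC offset on the I/Q drive to IC2, and the wanted
sideband is proportional to the peak audio drive, then the suppression
figures above imply effective offsets on this order:

    # In a quadrature exciter, DC on the I/Q drive shows up at the carrier
    # frequency, so S dB of carrier suppression relative to a sideband driven
    # at v_drive_peak implies an effective offset of roughly
    # v_drive_peak / 10**(S/20).

    def implied_dc_offset_v(v_drive_peak, suppression_db):
        return v_drive_peak / (10 ** (suppression_db / 20.0))

    for band, supp_db in (("40 m", 50), ("20 m", 40), ("12 m", 35)):
        offset_mv = implied_dc_offset_v(1.0, supp_db) * 1e3
        print("%s: %d dB suppression -> ~%.0f mV effective offset"
              % (band, supp_db, offset_mv))

Even a few tens of millivolts of effective offset (whether real DC leakage or
a frequency-dependent imbalance that merely looks like one at IC2's inputs)
would be enough to account for the worst-case 12 meter number.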

73,

- Jeff, WA6AHL


