On Jun 7, 2007, at 9:50 AM, wayne burdick wrote:
Brian Lloyd wrote:
...being able to change the S-meter slope and intercept strikes
me as a bad option. To me that is like changing the
calibration of a voltmeter or wattmeter because you like the
way the needle moves.
I'd agree with you if there were a single world-wide standard for
S-9, and no need to compensate for slight differences in receive
gain from one unit to the next. But the reality is that S-meters
usually require both scale and offset calibration.
Yes, they do. But they can be calibrated to a standard regardless of
the gain of the radio.
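For what it's worth, the usual reference point (the commonly cited IARU
Region 1 convention, not something from Wayne's note) is S9 = -73 dBm at
the antenna connector on HF, with 6 dB per S-unit. Here is a rough sketch
of what I mean by calibrating to the standard regardless of receiver gain;
the names (S9_DBM, gain_offset_db, and so on) are made up purely for
illustration:

# Sketch only: map a measured level in dBm to a standard S-reading.
# Assumes the commonly cited HF convention: S9 = -73 dBm, 6 dB per S-unit.
S9_DBM = -73.0
DB_PER_S_UNIT = 6.0

def s_reading(power_dbm, gain_offset_db=0.0):
    """Convert a level in dBm to an S-meter reading.

    gain_offset_db is the per-unit correction I'm talking about: it absorbs
    receiver-to-receiver gain differences so the displayed value still lands
    on the standard scale.
    """
    corrected = power_dbm - gain_offset_db
    if corrected <= S9_DBM:
        s_units = 9.0 + (corrected - S9_DBM) / DB_PER_S_UNIT
        return "S%.1f" % max(s_units, 0.0)
    return "S9+%.0fdB" % (corrected - S9_DBM)   # above S9, report "S9 + n dB"

print(s_reading(-73.0))   # S9.0
print(s_reading(-85.0))   # S7.0
print(s_reading(-63.0))   # S9+10dB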
This is also more flexible. As I mentioned earlier, I set my S-
meters up for 4 dB per S-unit. Here's why: I like a greater degree
of sensitivity in the S-meter so I can see the effects of things
like preamp on/off, filter changes, notch, NR, etc. It also makes
band-pass filters easier to tweak when there isn't a scope or AF
voltmeter handy, and you can more readily see the effect of an
improved antenna during A/B testing.
What you are doing is changing the calibration instead of changing
the *resolution*. What you really want to do is to be able to resolve
smaller changes easily. So blow up the scale. Add calibration points
for half S-units. That would give you 3 dB points on the meter, which
is even finer than your 4 dB resolution!
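To put numbers on it (mine, just for illustration): at the standard
6 dB per S-unit, half S-unit tick marks resolve 3 dB steps, the
calibration never moves, and you still beat the 4 dB steps you get by
rescaling. A quick sketch along the same lines as above, with made-up
names:

# Sketch only: round a dBm level to the nearest half S-unit on the standard scale.
S9_DBM = -73.0
DB_PER_S_UNIT = 6.0

def half_s_reading(power_dbm):
    s_units = 9.0 + (power_dbm - S9_DBM) / DB_PER_S_UNIT
    return round(s_units * 2) / 2    # 0.5 S-unit = 3 dB resolution

for dbm in (-85.0, -83.0, -82.0, -79.0):
    print(dbm, "dBm ->", "S%g" % half_s_reading(dbm))
# -85 -> S7, -83 -> S7.5, -82 -> S7.5, -79 -> S8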
If hams wanted to be precise in assessing signal levels, we'd
report them in dBm and do a lot of averaging.
I agree. I would prefer to have a meter calibrated in dBm but we have
used S-units for so long that it is part of the fabric. Heck, we
still use the English Standard system of measurements in the US. I
have to switch back and forth between metric and ES all the time. And
sometimes it is convenient to measure resistance in ohms or
conductance in mhos, even though we know they express the same thing
(one is just the reciprocal of the other).
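On your point about averaging: one reasonable way to do it (my own
illustration, not anything a particular rig's firmware does) is to average
the samples in the linear power domain and then convert back to dBm, since
averaging the dB values directly gives you something different (the
geometric mean of the powers):

# Sketch only: average dBm readings via linear power (milliwatts).
import math

def average_dbm(samples_dbm):
    linear_mw = [10 ** (dbm / 10.0) for dbm in samples_dbm]
    mean_mw = sum(linear_mw) / len(linear_mw)
    return 10.0 * math.log10(mean_mw)

print(round(average_dbm([-73.0, -79.0, -85.0]), 1))
# -76.6, versus the -79.0 you'd get by naively averaging the dB values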
But for most operators this is a hobby, not a job :)
It is a technical hobby. We measure voltage, resistance, current, and
power quite accurately. Why, then, should we say that accurately
measuring received signal level is unimportant? You yourself say that
you use the S-meter to:
"...see the effects of things like preamp on/off, filter changes,
notch, NR, etc. It also makes band-pass filters easier to tweak when
there isn't a scope or AF voltmeter handy, and you can more readily
see the effect of an improved antenna during A/B testing."
Clearly you are using it as an instrument of measurement. Why not
have it conform to a standard so that the readings are useful rather
than just randomly relative?
I think this gets back to my comment about resolution. If you are
using a quantized bar-graph display it is easier to change the
calibration than to change the resolution. OTOH, a three-digit readout
would be nice, or even an analog meter. (I actually still prefer analog
meters for a lot of things, especially calibration work where you're
tweaking something.)
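Just to illustrate the point about quantized displays (hypothetical code,
nothing to do with any actual Elecraft firmware): the same standard
calibration can drive a finer bar graph simply by using narrower segments.

# Sketch only: text bar graph from S0 to S9; segment width sets the resolution.
S9_DBM = -73.0
DB_PER_S_UNIT = 6.0

def bargraph(power_dbm, db_per_segment):
    s0_dbm = S9_DBM - 9 * DB_PER_S_UNIT              # S0 on the standard scale
    span_db = max(0.0, min(power_dbm, S9_DBM) - s0_dbm)
    total = int(9 * DB_PER_S_UNIT / db_per_segment)
    lit = min(total, int(round(span_db / db_per_segment)))
    return "[" + "#" * lit + "." * (total - lit) + "]"

signal = -80.5  # dBm, about S7.75 on the standard scale
print(bargraph(signal, db_per_segment=6.0))  # whole S-units: 9 coarse segments
print(bargraph(signal, db_per_segment=3.0))  # half S-units: 18 segments, same calibration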
Never mind.
73 de Brian, WB6RQN
Brian Lloyd - brian HYPHEN wb6rqn AT lloyd DOT com