Fair question ... actually a good question.
Short answer: On vintage receivers, the VFO frequency [the local
oscillator that heterodynes the received signal to the IF] is switched
for each band ... except of course if it's a Collins radio. It's a
single dial with multiple fixed scales but multiple frequency-determining
networks. Calibrating one to WWV has no bearing on any of the others,
and your example is correct.
For a K3 [and all like it], the "VFO" is synthesized from a single,
non-switched source regardless of band. Get it within 1 Hz at 80, and
it will be within 2 Hz [or so, it's digital after all] at 40. So, you
want to do the adjustment at the highest possible frequency ... all
the lower ones will be *at least* as good.
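To put numbers on that: since every output frequency is derived from the same reference, the *fractional* error is identical on all bands, and the absolute error in Hz scales with frequency. Here's a minimal sketch of the arithmetic, assuming a residual reference error of 0.1 ppm after calibration (my illustrative number, not an Elecraft spec):

```python
# Why one calibration point covers all bands on a synthesized rig:
# every output frequency shares the reference oscillator's fractional
# error, so absolute error (Hz) grows linearly with frequency.
REF_PPM_ERROR = 0.1  # assumed residual error after calibration (ppm)

for band_mhz in (3.5, 7.0, 14.0, 21.0, 28.0):
    hz_error = band_mhz * 1e6 * REF_PPM_ERROR * 1e-6
    print(f"{band_mhz:5.1f} MHz -> off by about {hz_error:.2f} Hz")
```

Calibrate at 28 MHz and the worst-case error on every lower band is proportionally smaller, which is why you chase the highest WWV signal you can hear.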
A critical factor in this procedure is that the WWV signal strength
needs to be high enough to discern the zero beat clearly. I did mine at
20 MHz when there were lots of sunspots. At the end of 2016, you may
need to settle for a lower WWV frequency. The difference will be tiny
in any case.
73,
Fred K6DGW
- Sparks NV DM09dn
- Northern California Contest Club
- CU in the Cal QSO Party 7-8 Oct 2017
- www.cqp.org
On 12/17/2016 9:17 PM, Brian Denley wrote:
Fred, Don: I ask because I am curious. On any older receiver,
calibration at 20 MHz would not guarantee calibration below that (or at
any other frequency). One could be 5 Hz high at 30 MHz but 10 Hz low at
7 MHz. Why is the K3 different?
Brian Denley KB1VBF Sent from my iPad
______________________________________________________________
Elecraft mailing list
Home: http://mailman.qth.net/mailman/listinfo/elecraft
Help: http://mailman.qth.net/mmfaq.htm
Post: mailto:Elecraft@mailman.qth.net
This list hosted by: http://www.qsl.net
Please help support this email list: http://www.qsl.net/donate.html