I know you guys like to drill holes through hairs, but... as I follow the discussions on the merits of 50 vs. 75 ohm cable, cable length changing with ambient temperature, tuning cable to a fraction of a wavelength, power-supply noise, etc., a question occurs to me.
Doesn't the "weakest" link determine the best achievable accuracy? If so, the receiver/electronics and internal firmware seem to be the weakest link, and all the small nanosecond/picosecond variables under discussion seem moot, at least with the unit in question. If your receiver computes your location with a rather large error in lat/lon or altitude, that error would, I think, be greater than the sum of all the "small" factors/errors being discussed. I have found, at least with the Thunderbolt receivers I have used, that they are rather sloppy in their location fix and even worse in their altitude fix.

As I understand it, each foot of distance is a little over a nanosecond of delay, so wouldn't position/altitude accuracy be the biggest variable — not to mention the proper calculation and offset of antenna cable attenuation/length? I would be curious how the older Thunderbolt units compare to a newer-technology receiver/timebase in the real world.

Just a thought.

_______________________________________________
time-nuts mailing list -- time-nuts@febo.com
To unsubscribe, go to https://www.febo.com/cgi-bin/mailman/listinfo/time-nuts
and follow the instructions there.
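To put rough numbers on the point above, here is a back-of-envelope sketch in Python. The figures used (a 10 m altitude error, 30 m of coax with a 0.66 velocity factor) are illustrative assumptions on my part, not measurements from any particular Thunderbolt:

```python
# Back-of-envelope: how position/altitude error and cable length map into
# timing error. The 10 m altitude error and 30 m / 0.66-VF cable are
# assumed example values, not measured ones.

C = 299_792_458.0   # speed of light in vacuum, m/s
FOOT = 0.3048       # metres per foot

# Free-space delay per foot -- the "a little over a nanosecond" figure.
ns_per_foot = FOOT / C * 1e9
print(f"delay per foot:        {ns_per_foot:.3f} ns")

# Timing error from a fixed-position error of, say, 10 m in altitude.
alt_error_m = 10.0
pos_err_ns = alt_error_m / C * 1e9
print(f"10 m position error:   {pos_err_ns:.1f} ns")

# Cable delay: the signal travels slower in coax (velocity factor ~0.66
# for solid-polyethylene cable), so the delay value entered into the
# receiver's antenna-cable compensation matters at this scale too.
cable_m = 30.0
vf = 0.66
cable_delay_ns = cable_m / (C * vf) * 1e9
print(f"30 m cable delay:      {cable_delay_ns:.1f} ns")
```

So a 10 m position/altitude error alone is worth tens of nanoseconds — far larger than the picosecond-level effects being debated — which is the comparison I'm getting at.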