jlevine wrote:
> The difficulty is that timing laboratories do not distribute
> frequency as distinct from time. Precision frequency comparisons
> are done by measuring the evolution of the time difference (which
> is often expressed as a fraction of a cycle for the highest-
> precision comparisons) between a device under test and a
> calibration source. Thus, changing the length of the second
> effectively changes the frequency that is being transmitted.
While I initially thought I could deal with this objection simply by distributing TAI in addition to my new form of civil time, further thought, and some research into the time scale that preceded the current system, suggests an alternate possibility that answers this objection, although it may have flaws of its own.

In a year that would otherwise have one leap second, instead of adding some 30-odd nanoseconds to every second, perhaps it would be better simply to add *one millisecond* to one second every eight hours for the first 333 1/3 days of the year. Since a leap second is 1000 milliseconds, and this scheme inserts three milliseconds per day, the full second is absorbed in 1000/3 = 333 1/3 days. This accomplishes the goal of sweeping the leap second under the rug without preventing time signals from serving as a source of frequency calibrations.

If you want to set an accurate frequency, just remember to compare time drift *within* one of the appropriate eight-hour periods. If, on the other hand, you want to calibrate a clock so as to *include* this year's leap second, set it to keep time based on two comparisons to the time signal that are some multiple of eight hours apart.

John Savard

_______________________________________________
questions mailing list
[email protected]
https://lists.ntp.isc.org/mailman/listinfo/questions
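The arithmetic behind the proposed schedule can be checked with a short sketch (my own illustration, not part of the original post; the variable names are mine):

```python
# A leap second is 1000 ms. The proposal inserts 1 ms into one second
# every 8 hours, i.e. 3 stretched seconds per day, so the whole leap
# second is absorbed in 1000 / 3 days.

LEAP_MS = 1000            # one leap second, expressed in milliseconds
INTERVAL_HOURS = 8        # one stretched second every eight hours

steps_per_day = 24 // INTERVAL_HOURS        # 3 insertions per day
days_needed = LEAP_MS / steps_per_day       # days to absorb the full second

print(steps_per_day)              # 3
print(round(days_needed, 2))      # 333.33

# For comparison, the uniform smear the post argues against: spreading
# ~1 s over a whole year lengthens every second by roughly 32 ns.
ns_per_second = 1e9 / (365.25 * 86400)
print(round(ns_per_second, 1))    # 31.7
```

This confirms the "333 1/3 days" figure in the post, and shows why the per-second stretch in a year-long smear lands in the "30-odd nanoseconds" range.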
