In article <[EMAIL PROTECTED]>, Richard Eich <[EMAIL PROTECTED]> writes:

> If I were to set up five or six global locations, each time-synchronized
> to a local GPS receiver, how much variability would I expect there to be
> between the time as indicated by the GPS satellites used by each local
> GPS receiver?
What decimal point are you interested in? Nanoseconds? Microseconds? Milliseconds? Are you interested in the time out of the GPS box or the time out of an NTP server connected to a GPS box? How much money are you willing to spend?

The GPS side depends upon how fancy your gear is and how good your antenna location is. The speed of light is a foot per nanosecond. GPS position accuracy for consumer gear is generally within a few tens of feet, so time accuracy should be within a few tens of nanoseconds. The national standards laboratories use GPS to compare their atomic clocks, so you can get very, very good if you work hard enough.

With a bit of work, you can probably get an NTP server using GPS to be within a few to a few tens of microseconds. That's good enough that you can easily measure asymmetries or jumps in network routing (assuming similar setups on both ends).

Getting the time from a good NTP server to other systems gets interesting.

-- 
These are my opinions, not necessarily my employer's.  I hate spam.
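
To make the "measure asymmetries" point concrete, here is a minimal sketch of comparing the offsets reported by two GPS-backed NTP servers. It assumes the third-party ntplib package is installed; the hostnames are placeholders, not servers mentioned in the post.

#!/usr/bin/env python3
# Minimal sketch: query two (hypothetical) GPS-disciplined NTP servers and
# compare the offsets they report. Requires the third-party ntplib package.

import ntplib

# Placeholder hostnames -- substitute your own GPS-backed servers.
SERVERS = ["ntp-gps-site1.example.org", "ntp-gps-site2.example.org"]

client = ntplib.NTPClient()
for host in SERVERS:
    resp = client.request(host, version=3)
    # offset: estimated difference between the server's clock and ours,
    #         computed under NTP's assumption of a symmetric network path.
    # delay:  measured round-trip time to the server.
    print(f"{host}: offset {resp.offset * 1e6:+9.1f} us, "
          f"round-trip delay {resp.delay * 1e3:.2f} ms")

If both servers really are within a few microseconds of GPS time, a large or jumping difference between the offsets they report is the network's doing rather than the clocks': NTP attributes half of any path asymmetry to clock offset, so routing changes show up directly in these numbers.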
