On Thu, Jan 27, 2011 at 12:55 PM, Perry Sandeen <sandee...@yahoo.com> wrote:

> ...how the heck they were able to calibrate a clock to milliseconds per day 
> back then?

Let it run for 1,000 days; then you only need to measure the accumulated
error to the nearest second to resolve one millisecond per day.  Or if you
can measure to 0.1 seconds, it only takes 100 days.
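Just to spell out that arithmetic, a quick Python sketch (the function name
and the numbers only restate the figures above):

    # Rate resolution from a long run: readout resolution divided by run length.
    def rate_resolution_ms_per_day(readout_resolution_s, run_days):
        return readout_resolution_s * 1000.0 / run_days

    print(rate_resolution_ms_per_day(1.0, 1000))   # 1 s readout / 1000 days -> 1.0 ms/day
    print(rate_resolution_ms_per_day(0.1, 100))    # 0.1 s readout / 100 days -> 1.0 ms/day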

The trouble is that with this method you only learn the average error, not
the short-term variation.  A good example is an eccentric gear that makes
the second hand run fast, then slow, but averages out to near perfect over
a long period.  I doubt they were able to catch stuff like that.
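
To make the gear example concrete, here is a toy Python simulation; the
30-day period and 50 ms/day amplitude are invented for illustration, not
anything historical:

    import math

    period_days = 30.0            # assumed period of the eccentricity (made up)
    amplitude_ms_per_day = 50.0   # assumed peak daily rate error (made up)

    # Daily rate error that swings fast/slow like an eccentric gear.
    daily_errors = [amplitude_ms_per_day * math.sin(2 * math.pi * d / period_days)
                    for d in range(1000)]

    print(sum(daily_errors) / len(daily_errors))   # ~0.3 ms/day: the long average hides it
    print(max(daily_errors))                       # ~50 ms/day on the worst days

A 1,000-day average would report a clock like that as nearly perfect even
though it is off by tens of milliseconds on any given day.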


-- 
=====
Chris Albertson
Redondo Beach, California
