I've got two (x86_64 Linux) machines between which I need to
determine the relative time difference. The problem is, the
machines are not directly connected and only one is connected to the
Internet.
The topology looks like this:
box_1 <--VPN--> box_common <--LAN--> box_2
It's not so important that box_1 and box_2 are synced, but that
their time difference is accurately measured.
So, box_1 and box_common sync via NTP to (different) public stratum
2 NTP servers. box_2 syncs to box_common.
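For what it's worth, the relevant ntp.conf lines are roughly of this shape
(the hostnames here are stand-ins, not the actual servers I use):

# on box_1 (some public stratum 2 server)
server ntp1.example.net iburst

# on box_common (a different public stratum 2 server)
server ntp2.example.net iburst

# on box_2
server box_common iburst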
I wrote a simple program to try to determine the time difference
between box_1 and box_2. The program works like this:
box_1 and box_2 run the program in "server" mode; when queried, the
server program calls gettimeofday() and returns the timeval result
via UDP.
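In sketch form, the server side is something like the following (port 9999
is just a placeholder, error handling is trimmed, and since both boxes are
x86_64 Linux I show the raw struct timeval going over the wire):

#include <stdio.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/time.h>

int main(void)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr = { 0 };

    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(9999);            /* placeholder port */
    if (s < 0 || bind(s, (struct sockaddr *)&addr, sizeof(addr)) < 0) {
        perror("socket/bind");
        return 1;
    }

    for (;;) {
        char query[64];
        struct sockaddr_in peer;
        socklen_t peerlen = sizeof(peer);
        struct timeval now;

        /* Wait for a query packet; its contents don't matter. */
        if (recvfrom(s, query, sizeof(query), 0,
                     (struct sockaddr *)&peer, &peerlen) < 0)
            continue;

        /* Stamp the local time and send it straight back.
         * Both boxes are x86_64 Linux, so the raw struct is fine here;
         * a portable version would use fixed-width, network byte order. */
        gettimeofday(&now, NULL);
        sendto(s, &now, sizeof(now), 0,
               (struct sockaddr *)&peer, peerlen);
    }
}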
box_common runs the program in "client" mode, which does the
following (a rough sketch follows the list):
- record the current time via gettimeofday() as t1
- send a UDP packet to the remote server
- wait for the UDP response (i.e. the remote machine's actual
time)
- record the time of the UDP response as t2
- define "remote calculated time" as the midpoint between t1 and
t2
- calculate the difference between "remote calculated time" and
the actual remote time as provided by the UDP response; call
this t_d
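Again as a rough sketch (placeholder addresses and port, no timeouts or
retries, one sample per box), the client side looks roughly like this:

#include <stdio.h>
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <sys/time.h>

static double tv_to_sec(const struct timeval *tv)
{
    return tv->tv_sec + tv->tv_usec / 1e6;
}

/* Query one server and return t_d = midpoint(t1, t2) - remote time. */
static double measure_t_d(const char *server_ip)
{
    int s = socket(AF_INET, SOCK_DGRAM, 0);
    struct sockaddr_in addr = { 0 };
    struct timeval t1, t2, remote;
    char ping = 0;

    addr.sin_family = AF_INET;
    addr.sin_port = htons(9999);            /* placeholder port */
    inet_pton(AF_INET, server_ip, &addr.sin_addr);

    gettimeofday(&t1, NULL);                              /* t1 */
    sendto(s, &ping, 1, 0, (struct sockaddr *)&addr, sizeof(addr));
    recvfrom(s, &remote, sizeof(remote), 0, NULL, NULL);  /* remote time */
    gettimeofday(&t2, NULL);                              /* t2 */

    /* "Remote calculated time" = midpoint of t1 and t2. */
    return (tv_to_sec(&t1) + tv_to_sec(&t2)) / 2.0 - tv_to_sec(&remote);
}

int main(void)
{
    /* Placeholder addresses: box_1 over the VPN, box_2 over the LAN. */
    double t_d1 = measure_t_d("10.8.0.2");
    double t_d2 = measure_t_d("192.168.1.2");

    printf("t_d1 = %.6f s, t_d2 = %.6f s, t_d1 - t_d2 = %.6f s\n",
           t_d1, t_d2, t_d1 - t_d2);
    return 0;
}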
Once I've done the above for both box_1 and box_2, I take the
difference between t_d1 and t_d2.
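Concretely (with made-up numbers): if the query to box_1 gives
t_d1 = +3.2 ms and the query to box_2 gives t_d2 = -1.1 ms, I'd conclude
the two clocks differ by about t_d1 - t_d2 = 4.3 ms.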
I'm just looking for some feedback on my approach: is it valid (i.e.
am I overlooking something?), and are there better approaches?
Thanks!
Matt