Brian,

The fuzz is indeed intended only for gettimeofday() and is not effective in simulation. My comment on simulation should have been about the roundoff/bias considerations and the tick interval. I have run the simulation with a simulated tick interval of one second (!) and the algorithms work rather well. In other words, the expected errors relative to a fine-grained resolution are surprisingly small. It all depends on careful residual management and unbiased averaging.
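
(For what it's worth, the unbiasedness is easy to demonstrate numerically. The sketch below is only an illustration, not code from the distribution: it truncates a "true" time to a one-second tick, adds uniform random fuzz over the tick interval, and checks that the mean error of the fuzzed readings comes out near zero.)

    /* Illustration only, not ntpd code: truncate a "true" time to a
     * coarse tick, add uniform random fuzz over the tick interval,
     * and check that the mean error is near zero (i.e. unbiased). */
    #include <stdio.h>
    #include <stdlib.h>

    int
    main(void)
    {
            double tick = 1.0;              /* simulated tick, seconds */
            double sum = 0.0;
            int i, n = 1000000;

            srand(1);
            for (i = 0; i < n; i++) {
                    /* "true" time somewhere in a 1000-second window */
                    double t = (double)rand() / RAND_MAX * 1000.0;
                    /* what a coarse clock would report */
                    double reading = (double)(long)(t / tick) * tick;
                    /* uniform fuzz over one tick interval */
                    double fuzz = (double)rand() / RAND_MAX * tick;

                    sum += (reading + fuzz) - t;
            }
            printf("mean error = %g s\n", sum / n);
            return 0;
    }

The individual errors are as large as a whole tick, of course; it is only the average that is unbiased, which is what the residual management and averaging depend on.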

Brian Utterback wrote:

I think that this is Damion's point. If you look at the code itself,
the fuzz code is not used if:

1. You are using clock_gettime.
or
2. You are using getclock
or
3. You are using the simulator.

So the fact that the simulator is doing okay is irrelevant, since it does not use the fuzz code. But more to the point, the clock_gettime and getclock functions claim to return nanoseconds, so only two bits are available to fuzz, and the code does not bother to fuzz those last two bits. Damion's point is that the actual precision of the clock on his system is much coarser, so more bits are really non-significant and should be fuzzed, but they are not.

I don't think he is actually commenting on the accuracy of the time
derived from fuzzed values, just the fact that he is not seeing any
fuzz at all.
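
(To picture what Damion is asking for, purely as a sketch and not the actual ntpd code, one could derive the fuzz mask from the measured clock precision instead of from the nominal resolution of the API. The fuzz_fraction() helper and its precision argument below are made-up names for illustration.)

    /* Sketch only, not ntpd code: fuzz every bit of the 32-bit NTP
     * fraction that lies below the measured clock precision, instead
     * of only the last two bits implied by a nanosecond API. */
    #include <stdio.h>
    #include <stdint.h>
    #include <stdlib.h>
    #include <math.h>

    static uint32_t
    fuzz_fraction(uint32_t frac, double precision)  /* precision in seconds */
    {
            /* high-order fraction bits that are actually significant */
            int sigbits = (int)ceil(-log2(precision));
            uint32_t mask;

            if (sigbits >= 32)
                    return frac;            /* nothing left to fuzz */
            if (sigbits < 0)
                    sigbits = 0;
            mask = 0xffffffffu >> sigbits;  /* the non-significant bits */

            /* random() supplies 31 random bits; plenty for this sketch */
            return (frac & ~mask) | ((uint32_t)random() & mask);
    }

    int
    main(void)
    {
            srandom(1);
            /* 10 ms precision -> 7 significant bits, 25 bits fuzzed */
            printf("%08lx\n", (unsigned long)fuzz_fraction(0x12345678u, 0.01));
            return 0;
    }

A real change would need a proper 32-bit random source and would presumably take the precision from the value ntpd already measures at startup.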

David L. Mills wrote:

Damion,

The ntpd in ntp-dev has been run in simulation with tick = 10 ms and has done amazingly well. The low-order non-significant bits are set to a random fuzz that apparently averages out just fine.

Dave

Damion de Soto wrote:

Brian Utterback wrote:

Yes and no. If your system supports either clock_gettime or getclock,
then the code does not bother with the random bitstring, since there
are only two unused bits to set. Not worth the trouble.



Thanks, but I have a system here with a very low-resolution system clock; ntpd correctly detects this via default_get_precision() as:
Feb 13 07:01:31 ntpd[59]: precision = 10000.000 usec

I have clock_gettime() available to me, but the nanosecond values will be mostly wrong, since 10 ms only gives me 7 bits of precision. This means the low-order bits of the fractional seconds in the Transmit Timestamp are nearly always the same.
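
(As a cross-check on the numbers: 10 ms is about 2^-6.64 s, so only the top 7 bits of the 32-bit fraction can ever differ between readings, which matches the figure above. The probe below is a rough illustration of the idea behind a precision measurement such as default_get_precision(), not the actual code: read the clock repeatedly and keep the smallest nonzero increment.)

    /* Rough illustration, not the actual default_get_precision():
     * measure clock precision as the smallest nonzero increment seen
     * between successive readings. */
    #include <stdio.h>
    #include <time.h>

    int
    main(void)
    {
            struct timespec ts;
            double prev, now, min = 1.0;
            int i;

            clock_gettime(CLOCK_REALTIME, &ts);
            prev = ts.tv_sec + ts.tv_nsec / 1e9;
            for (i = 0; i < 100000; i++) {
                    clock_gettime(CLOCK_REALTIME, &ts);
                    now = ts.tv_sec + ts.tv_nsec / 1e9;
                    if (now > prev && now - prev < min)
                            min = now - prev;
                    prev = now;
            }
            /* on a 10 ms clock this prints roughly 10000.000 usec */
            printf("measured precision = %.3f usec\n", min * 1e6);
            return 0;
    }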


Has no-one else ever run into this before?

Regards,




