Yo Kurt! Note I have changed the Subject of this thread as it no longer had any relationship to libsodium.
And before we continue, for clarity, I'm going to try to define a few terms. Given the terminology swamp that is NTP, do not try too hard to map these onto the same words defined and used differently by NTP. Even worse, NTP is not self-consistent. Context is everything. Many will disagree with my terms; for those that do, just think of the names I use as arbitrary variable names and try to follow the concept.

I worked for, and later with, John Fluke Mfg. Co., Inc. For those too young to remember, John Fluke was David Packard's roommate and also founded his own metrology equipment company. I continue to use the terminology and framework they drilled into their people and users.

To make this a little easier to visualize, I'll walk you through a metrology demo I saw, and gave, many times over many years. I'll spare you the full 30+ minute canned presentation and go directly to some illustrative result values.

Imagine in front of you are two handheld voltmeters and a super precision voltage source. Both voltmeters are what are called "three and a half digit" voltmeters. They can read from 0.000 Volts to 2.999 Volts. To make it simple, assume no autoranging, or even any ranging at all.

What is the 'precision' of these two voltmeters? To me, the precision is the LSB on the display, in this case: 1 milli Volt. So the No-Name $5 meter has the same precision as the $250 Fluke meter, right? I certainly think so. Precision is just the smallest unambiguous unit that I can measure. So why do people pay so much more for the Fluke?

Time to hook up the 12 digit, 1 ppm, NIST traceable calibrated voltage reference to both meters, one at a time. I set the voltage standard to 0.950000000 Volts and feed that into the cheap meter. The cheap meter reads exactly 1.000 Volts. Or maybe I had to feed in 1.050000000 Volts to get 1.000 Volts on the cheap meter. Now, what is the 'precision' of the cheap meter? I say it is still 1 milli Volt. What is the 'accuracy'?
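Both meters in this demo can be modeled in a few lines of code: the display quantizes the input to the 1 mV LSB, and a calibration error shows up as a gain factor. A minimal Python sketch, my own model rather than anything Fluke published, with the gain value assumed from the 0.95 V reading (`decimal` is used so the rounding at the 1.0005 boundary is deterministic):

```python
from decimal import Decimal, ROUND_HALF_UP

LSB = Decimal("0.001")   # 1 milli Volt: the display resolution, i.e. the "precision"
VMAX = Decimal("2.999")  # full scale of the hypothetical 3.5-digit meter

def display(v_in: str, gain: str = "1") -> Decimal:
    """Model a meter: apply a gain (calibration) error, quantize to the LSB, clamp to range."""
    reading = (Decimal(v_in) * Decimal(gain)).quantize(LSB, rounding=ROUND_HALF_UP)
    return max(Decimal("0.000"), min(reading, VMAX))

# Cheap meter, ~5% gain error (assumed value): 0.950 V in, 1.000 V displayed.
print(display("0.950", gain="1.0526"))

# Fluke meter, negligible gain error: a single reading is still limited by
# the 1 mV LSB, but the 1.000 -> 1.001 transition pinpoints the input.
print(display("1.000490"))
print(display("1.000500"))
```

The point of the sketch: `LSB` sets the precision of both meters, while `gain` sets the accuracy of each one, and the two are independent.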
I say it is 5% off. The cheap meter has 5% accuracy.

Now plug in the Fluke meter. I set the calibrator to 1.000000000 Volts and the meter reads 1.000 Volts. How accurate, NIST traceable, would you say the Fluke meter is? You guess 0.1%? Maybe, maybe not. Now I set the calibrator to 1.000490000 and the meter still reads 1.000. I set the calibrator to 1.000500000 and the meter changes to read 1.001. How accurate is the meter? I say the meter is accurate to 0.001%. The Fluke meter can in fact reliably, repeatably, NIST traceably differentiate between two inputs that are only 0.001% apart! When that Fluke meter said a voltage was 1.000, you knew it was between 0.99951 Volts and 1.00049 Volts. You could in fact use it to accurately, repeatably, NIST traceably tune a voltage divider to 1.000 Volts +/- 0.001%. Compare that to the cheap meter that is only good to 5%. And yet both meters have the same 1 milli Volt (0.1% at 1 Volt) precision.

The canned demo in fact went into much more depth, and covered more topics, but I think this illustrates the way I frame measurements. To summarize, in my lexicon: precision is just how many digits I can resolve; accuracy is how close a measured value is to a traceable standard. Accuracy can be much larger or smaller than precision. None of this is to be confused with how NTP defines precision. And NTP never uses the word accuracy.

RGDS
GARY
---------------------------------------------------------------------------
Gary E. Miller
Rellim  109 NW Wilmington Ave., Suite E, Bend, OR 97703
g...@rellim.com  Tel:+1 541 382 8588

    Veritas liberabit vos. -- Quid est veritas?
    "If you can’t measure it, you can’t improve it." - Lord Kelvin
_______________________________________________
devel mailing list
devel@ntpsec.org
http://lists.ntpsec.org/mailman/listinfo/devel