Nick Maclaren wrote:

> Not at all.  "Precision" has been used to indicate the number of
> digits after the decimal point for at least 60 years, 

Not only that; remember, computer memory doesn't work in powers of ten.

> probably 100; in 40 years of IT and using dozens of programming
> languages, I have never seen "display" used for that purpose.

Yes, but since the representation in computers is based on powers of
two, a given precision in the binary system, i.e. a fixed number of
binary places, doesn't correspond to a fixed number of decimal
places. Hence the rounding on display, just to make the output look
prettier. The very small additional error is silently accepted.
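
For example, a quick sketch at the interactive prompt (the exact digit
strings shown depend on the Python version, since repr()'s rounding
rules have changed over time):

>>> 0.1 + 0.2               # the stored binary values carry a tiny error
0.30000000000000004
>>> print("%.20f" % 0.1)    # expand the binary value of 0.1 in decimal
0.10000000000000000555
>>> print(0.1)              # the default display rounds to fewer digits
0.1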

Regards,


Björn

-- 
BOFH excuse #199:

the curls in your keyboard cord are losing electricity.

