On Tuesday, 27 February 2018 at 00:04:59 UTC, H. S. Teoh wrote:

A 64-bit double can hold only about 15-16 significant decimal digits of precision. Anything past that, and there's a chance your "different" numbers are represented by exactly the same bits, so the computer can't tell them apart.
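For instance (a minimal D sketch; the literal values are just illustrative), two literals that differ only in the 18th significant digit round to the same 64-bit pattern:

import std.stdio;

void main()
{
    // 18 significant digits: past a double's ~16-digit limit,
    // so both literals round to the same bit pattern.
    double a = 1.0;
    double b = 1.00000000000000001;

    writeln(a == b);           // true
    writefln("%a  %a", a, b);  // identical hex-float representations
}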

T

I really miss having a (C#-like) decimal type.
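For what it's worth, here's the kind of surprise a decimal type avoids; a minimal D sketch, using scaled integers (hundredths) as a stand-in, since D has no built-in decimal:

import std.stdio;

void main()
{
    // Binary floating point cannot represent 0.1 or 0.2 exactly.
    writeln(0.1 + 0.2 == 0.3);  // false

    // A decimal type sidesteps this; so does scaled integer arithmetic.
    long a = 10, b = 20;        // 0.10 and 0.20 in hundredths
    writeln(a + b == 30);       // true: exact decimal arithmetic
}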
