On Tue, 08 Feb 2011 20:05:05 -0000, Nicolas Grekas <nicolas.grekas+...@gmail.com> wrote:

ini_set('precision', 17);

After some testing, here is what I get:

<?php

ini_set('precision', 14);
echo 0.1; // 0.1
echo 0.2; // 0.2
echo 0.3; // 0.3

ini_set('precision', 17);
echo 0.1; // 0.10000000000000001
echo 0.2; // 0.20000000000000001
echo 0.3; // 0.29999999999999999
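The same cutoff can be reproduced outside PHP. As an illustration (not part of the original mail), printf-style `%.Ng` formatting with N significant digits behaves like PHP's `precision` ini setting; here is a Python sketch of the snippet above:

```python
# Illustration: %.Ng prints a double with N significant digits,
# analogous to PHP's "precision" ini setting.
for value in (0.1, 0.2, 0.3):
    print("%.14g" % value)  # 14 digits: prints 0.1 / 0.2 / 0.3
    print("%.17g" % value)  # 17 digits: prints the full stored value,
                            # e.g. 0.10000000000000001
```

At 14 digits the representation error of these doubles (around 2^-53 relative) is rounded away; at 17 digits it becomes visible.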


The default precision of 14 (or 12) must have been chosen to avoid this overlong string representation of many simple floats, right?
While I agree with you that any data loss must be forbidden, couldn't this also break existing code?

Yes, I think it's dangerous to change the default display precision: we would have a ton of applications that currently show 0.2 suddenly showing 0.20000000000000001.

Would it be possible to "displays a value based on the shortest
decimal fraction that rounds correctly back to the true binary value",
like python 2.7 and 3.1 do ?
(http://docs.python.org/tutorial/floatingpoint.html)
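For reference (my illustration, not from the original mail), this is what Python's shortest-round-trip repr looks like in 2.7/3.1+:

```python
# Python 2.7+/3.1+ repr() picks the shortest decimal string that
# converts back to exactly the same binary double.
x = 0.1
print(repr(x))              # prints 0.1, not 0.10000000000000001
assert float(repr(x)) == x  # the short form round-trips exactly
```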


This may be a good idea for trunk, but I don't think it's feasible for 5.3, for the same reason. Showing the "shortest decimal fraction that rounds correctly back to the true binary value" works fine for numbers that are input directly, where the only error is the normal rounding error (i.e., the total uncertainty for x is x*2^-53). Once you start doing calculations with the numbers, the errors propagate, so in those scenarios you would still end up with many more "ugly" string representations than you have today with the default display precision.
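A concrete example of that propagation (again a Python illustration, since Python already does shortest-repr): the sum 0.1 + 0.2 does not produce the double nearest to 0.3, so even the shortest round-trip representation is "ugly":

```python
# Rounding errors from the two inputs accumulate in the sum, so the
# result is a different double than the one nearest to 0.3 -- and the
# shortest string that round-trips to it is long.
total = 0.1 + 0.2
print(repr(total))   # prints 0.30000000000000004, not 0.3
print(total == 0.3)  # prints False
```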

I agree that the information loss in e.g. PDO must be fixed, but it seems more appropriate to fix those problems by forcing another precision only in those cases.

--
Gustavo Lopes

--
PHP Internals - PHP Runtime Development Mailing List
To unsubscribe, visit: http://www.php.net/unsub.php