After some testing, here is what I get:

<?php

ini_set('precision', 14);
echo 0.1, "\n"; // 0.1
echo 0.2, "\n"; // 0.2
echo 0.3, "\n"; // 0.3

ini_set('precision', 17);
echo 0.1, "\n"; // 0.10000000000000001
echo 0.2, "\n"; // 0.20000000000000001
echo 0.3, "\n"; // 0.29999999999999999

?>

The default precision of 14 (or 12) must have been chosen to avoid
this overlong string representation of many simple floats, no?
While I agree with you that any data loss should be prevented,
couldn't raising the precision also break existing code?
Would it be possible to display "a value based on the shortest
decimal fraction that rounds correctly back to the true binary value",
like Python 2.7 and 3.1 do?
(http://docs.python.org/tutorial/floatingpoint.html)
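
(For illustration, a rough sketch of what that would mean: widen the
precision until the string parses back to the identical double. Python's
actual implementation uses a smarter shortest-repr algorithm; this is
just brute force.)

<?php

// Find the shortest decimal string that round-trips to the same double.
function shortest_repr($v)
{
    for ($digits = 1; $digits < 17; $digits++) {
        $s = sprintf('%.' . $digits . 'G', $v);
        if ((float)$s === $v) { // exact round trip?
            return $s;
        }
    }
    return sprintf('%.17G', $v); // 17 significant digits always round-trip
}

echo shortest_repr(0.1), "\n";       // 0.1
echo shortest_repr(0.1 + 0.2), "\n"; // 0.30000000000000004

?>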

Just my 2cts :)

Nicolas

As I've shown in the previous post, "looks better" doesn't mean "is more accurate". It's true that precision was set to 12 or 14 so that some small decimal fractions look better, but that can't be the only concern when the same precision is also used for serialization, such as during database escaping.
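
To make the loss concrete (the exact digits below assume IEEE 754 doubles):

<?php

ini_set('precision', 14);

$a = 0.1 + 0.2;             // true binary value is ~0.30000000000000004
$s = (string)$a;            // "0.3" at precision 14: the low bits are gone
var_dump((float)$s === $a); // bool(false), the round trip fails

?>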

It's hard to imagine how an echoed float being longer, or less aesthetically pleasing to the human eye, could break code; but when real precision is lost and the value is sent to the database that way, the problem is much more tangible.

As for formatting floats for humans, we still have printf() and number_format(), which allow explicit control independent of the ini setting (and that's what people use when they need such control).
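
For instance:

<?php

// Explicit display formatting, independent of the precision ini setting.
printf("%.2f\n", 0.1 + 0.2);      // 0.30
echo number_format(0.1 + 0.2, 4); // 0.3000

?>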

Stan Vass
