Toon Moene wrote:

But even if this were fixed, many users would still complain.
That's why I think that the Linux kernel should set the CPU
in double-precision mode, like some other OSes (MS Windows,
*BSD) -- but this is off-topic here.

It's not off-topic.  In fact, Jim Wilson argued this point here:

http://gcc.gnu.org/ml/gcc/2003-08/msg01282.html
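For concreteness, "setting double-precision mode" just means flipping
the x87 precision-control bits in the FPU control word, which a
program can also do for itself.  A minimal sketch, assuming x86
hardware and glibc's <fpu_control.h> (this is an illustration of the
mechanism, not a proposal for how the kernel should do it):

/* Switch the x87 precision control to double (53-bit significand).
   Assumes x86 with glibc's <fpu_control.h>. */
#include <fpu_control.h>
#include <stdio.h>

int main (void)
{
  fpu_control_t cw;

  _FPU_GETCW (cw);                            /* read current control word */
  cw = (cw & ~_FPU_EXTENDED) | _FPU_DOUBLE;   /* select 53-bit precision    */
  _FPU_SETCW (cw);                            /* write it back              */

  /* From here on, x87 arithmetic rounds results to double precision.
     Note that the exponent range is still the extended one, which is
     why the range stays unpredictable even in this mode. */
  volatile double x = 1.0, y = 3.0;
  printf ("%.17g\n", x / y);
  return 0;
}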

There are good arguments on either side of this issue. If you set
double-precision mode, then you get more predictable precision
(though the range is still unpredictable), at the expense of not
being able to make use of extended precision. There are many
algorithms that can take very effective advantage of extended
precision; for example, you can use log/exp to compute x**y
accurately if you have extended precision, but not otherwise.
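To make the log/exp point concrete: computing x**y as exp(y*log(x))
entirely in double loses accuracy because the rounding error in
y*log(x) is amplified by exp, whereas carrying the intermediate in
extended precision leaves enough headroom that the final double
result is (almost always) correctly rounded.  A hedged sketch in C,
assuming long double is the x87 80-bit format; this illustrates the
argument, not how any particular libm implements pow():

/* x**y via exp(y*log(x)), once with a double intermediate and once
   with an extended-precision (long double) intermediate. */
#include <math.h>
#include <stdio.h>

static double pow_via_double (double x, double y)
{
  /* The ~1 ulp error in y*log(x) is multiplied by |y*log(x)| in the
     relative error of the final result. */
  return exp (y * log (x));
}

static double pow_via_extended (double x, double y)
{
  /* Same formula, but the intermediate carries 64 significand bits. */
  return (double) expl ((long double) y * logl ((long double) x));
}

int main (void)
{
  double x = 1.1, y = 700.0;

  printf ("double intermediate:   %.17g\n", pow_via_double (x, y));
  printf ("extended intermediate: %.17g\n", pow_via_extended (x, y));
  printf ("libm pow:              %.17g\n", pow (x, y));
  return 0;
}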

Given that there are good arguments on both sides for what the
default should be, I see no good argument for changing the
default, which will cause even more confusion, since programs
that work now will suddenly stop working.
