On Tue, Nov 30, 2010 at 1:43 PM, Kagamin <s...@here.lot> wrote:

> Walter Bright Wrote:
>
> > How do you decide how many bits should be enough for any algorithm?
> >
> > The thing is, the FPU has 53 bits of precision and so ought to be correct
> > to the last bit.
>
> It's not me, it's the programmer. He was disgusted that his algorithm
> produced garbage, which means the error was unacceptable. Maybe it was 1%,
> maybe 80%, I don't know; that was his decision, that the result was
> unacceptable. The bug description assumes the problem was in the last bit,
> which means he wanted precision higher than the machine precision.
>

What programmer? What algorithm? As far as I can tell, this was found when
testing a library explicitly for accuracy, not in an application, so your
argument doesn't apply.
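
For what it's worth, here's a small D sketch (illustrative only, not from the
bug report in question) of what "correct to the last bit" means for a 53-bit
double: one ULP at 1.0 is 2^-52, and even two individually correctly rounded
operations can leave the combined result one ULP off.

import std.math : nextUp, sqrt;
import std.stdio : writefln;

void main()
{
    // double has a 53-bit significand, so one ULP at 1.0 is 2^-52.
    double ulp1 = nextUp(1.0) - 1.0;
    writefln("ulp(1.0) = %a", ulp1); // prints 0x1p-52

    // "Correct to the last bit" means zero ULPs of error versus the
    // exactly rounded result. sqrt is correctly rounded under IEEE 754,
    // yet squaring it rounds again, typically leaving a 1-ULP error:
    double computed = sqrt(2.0) * sqrt(2.0);
    double ulp2 = nextUp(2.0) - 2.0;
    writefln("sqrt(2)^2 - 2 = %g ULPs", (computed - 2.0) / ulp2);
}

So "accurate to the last bit" is a claim about the rounding of the final
result, not about any algorithm's tolerance for error, which is exactly why
an accuracy test of a library can flag it regardless of what some
application would accept.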
