Tim Peters added the comment:

I don't care about correct rounding here, but it is, e.g., a bit embarrassing 
that

>>> 64**(1/3)
3.9999999999999996

Which you may or may not see on your box, depending on your platform pow(), but 
which you "should" see:  1/3 is not a third, it's a binary approximation that's 
a little less than 1/3, so 64 to that power should be less than 4.
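
For example, the direction of the error is easy to confirm exactly with
`fractions`:

>>> from fractions import Fraction
>>> Fraction(1/3) < Fraction(1, 3)   # the float 1/3 is a little less than 1/3
True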

There are two practical problems with `nroot`, both already noted:  (1) Its 
very cheap initial guess comes at the cost of requiring multiple iterations to 
get a guess good to 54 bits.  That can be repaired, and I already showed how.  
(2) The larger `n`, the more bits it requires.  As Mark noted, at n=50000 we're 
doing multi-million bit arithmetic.  That's inherent - and slow.

My simple fractions.Fraction function suffers the second problem too, but is 
even slower for large n.
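
Just to make "multi-million bit" concrete, here's a rough illustration (a
throwaway sketch, not the actual `nroot` or Fraction code):  the starting
guess is a 53-bit float, and a single exact Newton step already needs
g**(n-1), whose numerator alone has roughly 53*(n-1) bits.

    from fractions import Fraction

    n = 5000
    g = Fraction(12345.6 ** (1.0 / n))  # exact value of the float guess
    # a few hundred thousand bits here; at n=50000 it's around 2.6 million
    print((g ** (n - 1)).numerator.bit_length())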

But if you give up on guaranteed correct rounding, this is really quite easy:  
as I noted before, a single Newton iteration approximately doubles the number 
of "good bits", so _any_ way of getting the effect of extra precision in a 
single iteration will deliver a result good enough for all practical purposes.  
Note that an error bound of strictly less than 1 ulp is good enough to 
guarantee that the exact result is delivered whenever the exact result is 
representable.

Here's one way, exploiting that the decimal module has adjustable precision:

    import decimal
    def rootn(x, n):
        # n'th root of x:  one Newton step, carried out with 40 significant
        # decimal digits
        c = decimal.getcontext()
        oldprec = c.prec
        try:
            c.prec = 40
            # starting guess from the platform pow(), good to nearly 53 bits
            g = decimal.Decimal(x**(1.0/n))
            # Newton step for g**n = x:  g <- ((n-1)*g + x/g**(n-1)) / n
            g = ((n-1)*g + decimal.Decimal(x)/g**(n-1)) / n
            return float(g)
        finally:
            # restore the caller's context precision
            c.prec = oldprec
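
A quick check of the embarrassing case above (what you see for 64**(1/3)
still depends on your platform pow(), as noted):

>>> 64**(1/3)
3.9999999999999996
>>> rootn(64.0, 3)
4.0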

Now memory use remains trivial regardless of n.  I haven't yet found a case 
where the result differs from the correctly-rounded `nroot()`, but I didn't try 
very hard.  And, e.g., even at the relatively modest n=5000, it's over 500 
times faster than `nroot` (as modified to use a scaled x**(1/n) as an excellent 
starting guess, so that it too always gets out after its first Newton step).  
At n=50000, it's over 15000 times faster.
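
If you want to check the scaling on your own box, a rough harness along these
lines does it (it only times `rootn` itself; 12345.6 and the repeat count are
arbitrary):

    import timeit

    # time 10000 calls at several n; the only n-dependent work inside rootn
    # is x**(1.0/n) and the 40-digit g**(n-1)
    for n in (3, 70, 5000, 50000):
        t = timeit.timeit(lambda: rootn(12345.6, n), number=10000)
        print(n, t)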

For n less than about 70, though, `nroot` is faster.  It's plenty fast for me 
regardless.

Note:  just in case there's confusion about this, `rootn(64.0, 3)` returns 
exactly 4.0 _not_ necessarily because it's "more accurate" than pow() on this 
box, but because it sees the exact 3 instead of the approximation to 1/3 pow() 
sees.  That approximation is perfectly usable to get a starting guess 
(64**(1/3)) good to nearly 53 bits.  The real improvement comes from the Newton 
step using _exactly_ 3.

That said, I have seen rare cases where this (and `nroot`) give a better result 
than the platform pow() even for n=2 (where 1/n _is_ exactly representable).
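
Here's one way to hunt for such n=2 cases (a quick throwaway check, using the
`rootn` above, and deciding "closer" by comparing exact squares as rationals):

    import random
    from fractions import Fraction

    def closer(a, b, x):
        # for floats a and b near sqrt(x), the one whose square is closer to
        # x (compared exactly) is the better approximation to sqrt(x)
        ea = abs(Fraction(a) ** 2 - Fraction(x))
        eb = abs(Fraction(b) ** 2 - Fraction(x))
        return ea < eb

    for _ in range(100000):
        x = random.uniform(1.0, 1e12)
        a, b = x**0.5, rootn(x, 2)
        if a != b and closer(b, a, x):
            print("rootn(x, 2) beat x**0.5 at x =", x)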

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue27761>
_______________________________________