On Mon, Apr 26, 2010 at 8:00 PM, Jason Grout
<jason-s...@creativetrax.com> wrote:
> When multiplying two univariate polynomials over RR[x], Sage uses the naive
> O(n^2) algorithm.  The docstring for _mul_ cites numerical stability as the
> reason to not use asymptotically faster algorithms:
>
> "Here we use the naive `O(n^2)` algorithm, as asymptotically faster
> algorithms such as Karatsuba can have very inaccurate results due to
> intermediate rounding errors. "
>
> (this is followed by an example where Karatsuba multiplication is shown to
> mess up compared to naive multiplication)
>
> Can we do any better?  Does anyone know of any algorithms for numerically
> stable fast univariate polynomial multiplication (better than the naive
> algorithm)?

1) I don't know what I'm talking about, but the first thing that pops
into my mind (in 1 second, literally) is to simply bump up the
precision by some heuristic amount, do the arithmetic with an
asymptotically fast algorithm using interval arithmetic, and check
what precision the result actually has.  If it is sufficient, return
the non-interval answer; if not, increase the precision and try again.
I think arithmetic with intervals costs only a constant factor (2? 4?)
over ordinary arithmetic, so what I just suggested should usually have
nearly the same asymptotic complexity as the fast algorithm itself.
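To make that concrete, here is a toy sketch of the retry loop -- my own illustration, not anything in Sage.  `IV` is a hypothetical interval type (Fraction endpoints rounded outward to `prec` significant bits), and `karatsuba` / `adaptive_mul` are illustrative names:

```python
# A toy sketch of the adaptive-precision idea above -- not Sage code.
# IV is a hypothetical interval type: Fraction endpoints rounded outward
# to `prec` significant bits, so every operation accounts for rounding error.
from fractions import Fraction

def _round(x, prec, up):
    """Round Fraction x to prec significant bits; up=True rounds toward +inf."""
    if x == 0:
        return Fraction(0)
    neg, ax = x < 0, abs(x)
    e = ax.numerator.bit_length() - ax.denominator.bit_length()
    if ax < Fraction(2) ** e:          # correct the exponent estimate
        e -= 1
    scale = Fraction(2) ** (prec - 1 - e)
    t = ax * scale                     # now in [2^(prec-1), 2^prec)
    n, rem = divmod(t.numerator, t.denominator)
    if rem and (up != neg):            # directed (outward) rounding
        n += 1
    r = Fraction(n) / scale
    return -r if neg else r

class IV:
    """Interval [lo, hi] with outward rounding to `prec` bits."""
    def __init__(self, lo, hi=None, prec=53):
        lo = Fraction(lo)
        hi = lo if hi is None else Fraction(hi)
        self.prec = prec
        self.lo = _round(lo, prec, False)
        self.hi = _round(hi, prec, True)
    def __add__(self, o): return IV(self.lo + o.lo, self.hi + o.hi, self.prec)
    def __sub__(self, o): return IV(self.lo - o.hi, self.hi - o.lo, self.prec)
    def __mul__(self, o):
        c = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
        return IV(min(c), max(c), self.prec)
    @property
    def mid(self): return (self.lo + self.hi) / 2
    @property
    def width(self): return self.hi - self.lo

def karatsuba(f, g, zero):
    """Karatsuba product of equal-length coefficient lists (any ring)."""
    n0 = len(f)
    if n0 == 1:
        return [f[0] * g[0]]
    if n0 % 2:                         # pad to even length
        f, g = f + [zero], g + [zero]
    m = len(f) // 2
    p0 = karatsuba(f[:m], g[:m], zero)
    p2 = karatsuba(f[m:], g[m:], zero)
    p1 = karatsuba([a + b for a, b in zip(f[:m], f[m:])],
                   [a + b for a, b in zip(g[:m], g[m:])], zero)
    # result = p0 + x^m * (p1 - p0 - p2) + x^(2m) * p2
    out = [zero] * (4 * m - 1)
    for i, c in enumerate(p0):
        out[i] = out[i] + c
        out[i + m] = out[i + m] - c
    for i, c in enumerate(p2):
        out[i + m] = out[i + m] - c
        out[i + 2 * m] = out[i + 2 * m] + c
    for i, c in enumerate(p1):
        out[i + m] = out[i + m] + c
    return out[:2 * n0 - 1]

def adaptive_mul(f, g, tol=1e-12):
    """Multiply float coefficient lists, doubling the precision until the
    result intervals are narrow enough, then return their midpoints."""
    n, prec = max(len(f), len(g)), 53
    while True:
        F = [IV(c, prec=prec) for c in f] + [IV(0, prec=prec)] * (n - len(f))
        G = [IV(c, prec=prec) for c in g] + [IV(0, prec=prec)] * (n - len(g))
        h = karatsuba(F, G, IV(0, prec=prec))[:len(f) + len(g) - 1]
        if all(c.width <= tol * max(1, abs(c.mid)) for c in h):
            return [float(c.mid) for c in h]
        prec *= 2
```

For well-scaled inputs the loop usually succeeds on the first pass; the point is that when cancellation makes the intervals wide, the retry catches it.  A real implementation would of course use MPFI-style interval arithmetic (what Sage exposes as RealIntervalField) rather than Fraction endpoints.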

2) There is something called Fateman multiplication, which might be
relevant.  Didier Deshommes put it in Sage four or so years ago; type
f._mul_fateman? for more.  Also, I've cc'd Richard Fateman himself, in
hopes he can say something about your question.
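If I remember right, the core of Fateman's trick for integer coefficients is Kronecker substitution: evaluate both polynomials at a large power of two, so the coefficients pack into one huge integer each, do a single bignum multiply (which GMP does asymptotically fast), and unpack.  A sketch of my own, assuming nonnegative integer coefficients (sign handling omitted):

```python
# A sketch of the packing trick behind Fateman/Kronecker multiplication.
# Assumes nonnegative integer coefficients; real implementations handle
# signs and choose the shift more carefully.
def kronecker_mul(f, g):
    """Multiply integer polynomials given as coefficient lists, low degree first."""
    # Each product coefficient is at most max(f)*max(g)*min(len(f), len(g)),
    # so b bits per slot keeps the slots from overlapping.
    b = (max(f) * max(g) * min(len(f), len(g))).bit_length() + 1
    F = sum(c << (i * b) for i, c in enumerate(f))   # f evaluated at 2^b
    G = sum(c << (i * b) for i, c in enumerate(g))   # g evaluated at 2^b
    H = F * G                                        # one big-integer multiply
    mask = (1 << b) - 1
    return [(H >> (i * b)) & mask for i in range(len(f) + len(g) - 1)]
```

For example, kronecker_mul([1, 2, 3], [4, 5]) gives [4, 13, 22, 15], i.e. (1 + 2x + 3x^2)(4 + 5x) = 4 + 13x + 22x^2 + 15x^3.  Whether anything like this can be made numerically sensible over RR is exactly the question.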

 -- William
