On Jul 2, 12:28 pm, David Harvey <[EMAIL PROTECTED]> wrote:

Hi David,

> Returning to a slightly old thread.....

I meant to answer earlier ...


> I have implemented a new algorithm for computing large Bernoulli  
> numbers.
>
> Running on 10 cores for 5.5 days, I computed B_k for k = 10^8, which  
> I believe is a new record. (Recall the Mathematica blog post  from  
> April was for k = 10^7.)

w00t :)

> Essentially it's the multimodular algorithm I suggested earlier on  
> this thread, but I figured out some tricks to optimise the crap out  
> of the computation of B_k mod p.
>
> Patch is up for review here:
>
> http://sagetrac.org/sage_trac/ticket/3542

We should assign an editor for that. The use of pthreads is OK, though
we might have to disable or port some code for the Windows port. There
seem to be multiple pthread libraries for Win32, but I cannot yet say
for sure whether any of them work.
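
For list readers who haven't looked at the preprint: the multimodular
idea in miniature is to compute B_k mod p for enough primes p and glue
the residues back together with CRT, using von Staudt-Clausen for the
denominator. Here is a toy Python sketch of that reconstruction (my
own illustration, nothing like the optimised mod-p kernel in bernmm --
the per-prime step below is the naive O(k^2) recurrence, and all the
function names are made up for this example):

```python
from fractions import Fraction
from math import comb, factorial, pi, prod

def is_prime(n):
    """Trial division; fine for a toy sketch."""
    if n < 2:
        return False
    d = 2
    while d * d <= n:
        if n % d == 0:
            return False
        d += 1
    return True

def bernoulli_mod_p(k, p):
    """B_k mod p via the classical recurrence
    sum_{j=0}^{m} C(m+1, j) B_j = 0, run entirely mod p.
    Valid for p > k + 1, so every m + 1 we divide by is invertible.
    This is O(k^2) per prime -- the whole point of bernmm is doing
    this step much faster."""
    B = [1]
    for m in range(1, k + 1):
        s = sum(comb(m + 1, j) % p * B[j] for j in range(m)) % p
        B.append(-s * pow(m + 1, -1, p) % p)
    return B[k]

def bernoulli_multimodular(k):
    """Reconstruct B_k exactly (k even, k >= 2) from residues
    modulo several primes."""
    # von Staudt-Clausen: the denominator of B_k is the product of
    # primes p with (p - 1) | k, so N = B_k * denom is an integer.
    denom = prod(p for p in range(2, k + 2)
                 if is_prime(p) and k % (p - 1) == 0)
    # |B_k| = 2 k! zeta(k) / (2 pi)^k <= 4 k! / (2 pi)^k, giving a
    # bound on |N|; once the CRT modulus exceeds 2*bound, N is pinned.
    bound = int(4 * factorial(k) / (2 * pi) ** k * denom) + 1
    r, M = 0, 1
    p = k + 2
    while M <= 2 * bound:
        if is_prime(p):
            rp = bernoulli_mod_p(k, p) * denom % p
            # CRT: combine x = r (mod M) with x = rp (mod p).
            r += M * ((rp - r) * pow(M, -1, p) % p)
            M *= p
        p += 1
    if r > M // 2:   # symmetric lift to recover the sign of N
        r -= M
    return Fraction(r, denom)
```

For example, bernoulli_multimodular(12) recovers -691/2730. The real
algorithm computes each residue independently, which is exactly what
makes the pthreads parallelisation across primes so clean.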

> Preprint is here (comments welcome):
>
> http://math.harvard.edu/~dmharvey/bernmm/bernmm.pdf
>
> One data point, on a 2.6GHz opteron:
>
> sage: time x = bernoulli(10^5, algorithm="pari")
> CPU times: user 0.16 s, sys: 0.00 s, total: 0.16 s
> Wall time: 20.57 s
> sage: time y = bernoulli(10^5, algorithm="bernmm")
> CPU times: user 6.54 s, sys: 0.00 s, total: 6.54 s
> Wall time: 6.54 s
> sage: time z = bernoulli(10^5, algorithm="bernmm", num_threads=3)
> CPU times: user 6.54 s, sys: 0.03 s, total: 6.57 s
> Wall time: 2.71 s
> sage: x == y
> True
> sage: x == z
> True
>
> Timings for some bigger k:
>
> k = 10^7:
> PARI/GP = 75 h
> Mathematica = 142 h
> bernmm (1 core) = 11.1 h
> bernmm (10 cores) = 1.3 h
>
> k = 10^8:
> bernmm (10 cores) = 131h = 5.5 days
>
> david

Very, very impressive to say the least.

Cheers,

Michael
