The theoretical complexity of all the recurrence-based algorithms is
usually quoted as n^2, but that only counts arithmetic operations and
ignores the size of the numbers themselves. Once you take that into
account, they all come out at about n^3 as far as I can see. You can
use Ramanujan's identities, the Akiyama-Tanigawa algorithm, or the
identity Lovelace used, but all of them are n^3 or so.
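
As an illustration (not the code anyone here is actually running),
here is a minimal sketch of Akiyama-Tanigawa in plain Python, with
fractions.Fraction standing in for exact rationals and the function
name mine. It performs about n^2/2 rational operations, but the
entries grow to roughly n log n bits each, which is where the extra
factor of n in the bit complexity comes from:

    from fractions import Fraction

    def bernoulli_akiyama_tanigawa(n):
        # Akiyama-Tanigawa triangle: returns B_0..B_n, with the
        # B_1 = +1/2 sign convention.
        a = [Fraction(0)] * (n + 1)
        result = []
        for m in range(n + 1):
            a[m] = Fraction(1, m + 1)
            # Fold the new entry back through the row in place.
            for j in range(m, 0, -1):
                a[j - 1] = j * (a[j - 1] - a[j])
            result.append(a[0])  # a[0] is now B_m
        return result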

The theoretical complexity of the version using the zeta function
looks something like n log n steps at a precision of n log n bits,
i.e. time n^2 (log n)^2.
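
Concretely, for even n >= 2 one recovers B_n from

    |B_n| = 2 * n! * zeta(n) / (2*pi)^n

computed to about n log n bits. A rough sketch, using mpmath purely
as a stand-in for whatever multiprecision arithmetic is really used
(this returns a floating-point value; getting the exact rational also
needs the von Staudt-Clausen denominator on top of this):

    from mpmath import mp, zeta, pi, factorial

    def bernoulli_via_zeta(n):
        # B_n for even n >= 2 via the zeta formula. The working
        # precision of ~n*log2(n) bits matches the size of B_n,
        # which is what drives the n^2 (log n)^2 estimate.
        assert n >= 2 and n % 2 == 0
        mp.prec = int(n * mp.log(n, 2)) + 10  # crude bound on size of B_n
        v = 2 * factorial(n) * zeta(n) / (2 * pi) ** n
        return v if n % 4 == 2 else -v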

Bill.

On 2 May, 21:24, David Harvey <[EMAIL PROTECTED]> wrote:
> One more data point (2.6GHz opteron):
>
> sage: time x = bernoulli(60000)
> Wall time: 3.79
>
> sage: time x = bernoulli(120000)
> Wall time: 16.97
>
> sage: time x = bernoulli(240000)
> Wall time: 118.24
>
> sage: time x = bernoulli(480000)
> Wall time: 540.25
>
> sage: time x = bernoulli(960000)
> Wall time: 2436.06
>
> The ratios between successive times are:
>
> 4.47757255936675
> 6.96758986446671
> 4.56909675236806
> 4.50913465987969
>
> If we guess that it's "really" 4.5, then the complexity is
> N^(log(4.5)/log(2)) = N^2.17. This is puzzling; I thought the
> algorithm was supposed to be better than quadratic. Does anyone know
> what the theoretical complexity is supposed to be?
>
> Anyway, extrapolating gives about 4.5 days, pretty much the same as  
> what Tom estimates. I'm going to start it running now.
>
> david
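
As a sanity check on the exponent in David's message: each doubling
of N multiplies the wall time by roughly 4.5, so the running time
scales like N^(log(4.5)/log(2)) ~ N^2.17. In plain Python:

    import math

    times = [3.79, 16.97, 118.24, 540.25, 2436.06]
    ratios = [b / a for a, b in zip(times, times[1:])]
    print(ratios)  # the four ratios David quotes
    print(math.log(4.5) / math.log(2))  # ~2.17, the fitted exponent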