On Thu, Mar 06, 2003 at 02:23:59PM +0000, Brian J. Beesley wrote:

> On Thursday 06 March 2003 13:03, Daran wrote:

> > However, some time ago, I was given some information on the actual P-1
> > bounds chosen for exponents of various sizes, running on systems of various
> > processor/memory configurations.  It turns out that P4s choose *much
> > deeper* P-1 bounds than do other processors.  For example:
> >
> > 8233409,63,0,Robreid,done,,40000,450000,,Athlon,1.0/1.3,90
> > 8234243,63,0,Robreid,done,,40000,450000,,Celeron,540,80
> > 8234257,63,0,Robreid,done,,45000,742500,,P4,1.4,100
> >
> > The last figure is the amount of available memory.  The differences between
> > 80MB and 100MB, and between 8233409 and 8234257 are too small to account
> > for the near doubling in the B2 bound in the case of a P4.
> 
> Yes, that does seem odd. I take it the software version is the same?

The only information I have is in the table; however, I do not think that
differences between the versions can account for the magnitude of the
discrepancy.  When I first received the data, I checked a few of the
exponents to see what bounds my client chose.  (I don't recall which version
I was running at that time.)  The results were a perfect match with the
non-P4s, given an identical memory allowance.  I've just done a similar
experiment with my current client (2.22.2), testing these three exponents
with 80, 100, and 120MB.  The results were:

MB      B1      B2
80      45000   472500
100     45000   585000
120     45000   585000

(There were no differences between the three exponents.)
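As background for the B1/B2 figures being compared here: stage 1 of P-1
finds a factor q of M_p = 2^p - 1 whenever every prime power dividing q-1
is at most B1 (stage 2 then extends this to catch one extra prime factor up
to B2).  A minimal sketch, purely for illustration and not Prime95's actual
implementation:

```python
from math import gcd

def small_primes(limit):
    """Sieve of Eratosthenes: all primes <= limit."""
    sieve = bytearray([1]) * (limit + 1)
    sieve[0:2] = b"\x00\x00"
    for i in range(2, int(limit ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i :: i] = b"\x00" * len(sieve[i * i :: i])
    return [i for i, is_p in enumerate(sieve) if is_p]

def pminus1_stage1(p, B1, base=3):
    """P-1 stage 1 on M_p = 2^p - 1: returns a factor (or a product
    of factors) of M_p, or None if nothing was found.

    Raises the base to every prime power <= B1; if q-1 is B1-smooth
    for some factor q of M_p, the gcd reveals it.
    """
    N = (1 << p) - 1
    a = base
    for q in small_primes(B1):
        e = q
        while e * q <= B1:
            e *= q
        a = pow(a, e, N)
    g = gcd(a - 1, N)
    return g if g > 1 else None

# Cole's factor 193707721 of M_67 has
# 193707721 - 1 = 2^3 * 3^3 * 5 * 67 * 2677,
# so B1 around 2700 is already enough to find it.
print(pminus1_stage1(67, 2700))
```

This also shows why deeper bounds matter: every increase in B1 (and, via
stage 2, B2) enlarges the set of factors the test can catch.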

I noticed a slight increase in the limits I was getting when I upgraded to
this version, and I think that's what we're seeing here.  Even with 120MB,
the B2 value on my Duron is significantly lower than the P4's with 100MB.

> The only thing that I can think of is that the stage 2 storage space for 
> temporaries is critical for exponents around this size such that having 90 
> MBytes instead of 100 MBytes results in a reduced number of temporaries, 
> therefore a slower stage 2 "iteration time", therefore a significantly lower 
> B2 limit.

George modified the software recently so as to estimate memory requirements
more conservatively.  My reason for performing the above test with
120MB was to guarantee that my client saw more temporaries than did the P4
with 100MB.

I looked in the source for an explanation of this some time ago.  There is
a slight difference in the code path taken by SSE2-capable processors, but I
didn't understand why it should have any such effect.

> I note also that the limits being used are typical of DC assignments...

That's all the info I have.

> ...For 
> exponents a bit smaller than this, using a P3 with memory configured at 320 
> MBytes (also no OS restriction & plenty of physical memory to support it) but 
> requesting "first test" limits (Pfactor=<exponent>,<tfbits>,0) I'm getting B2 
> ~ 20 B1 e.g.
> 
> [Thu Mar 06 12:07:46 2003]
> UID: beejaybee/Simon1, M7479491 completed P-1, B1=90000, B2=1732500, E=4, 
> WY1: C198EE63

I've just tried that exponent.  Setting factored bits to 63, and available
memory to 320MB, I get the same limits as you.

Changing it to a doublecheck, I get B1=40000, B2=610000.  More generally, I
find B2 ~ 15 B1 for doublechecks.

> The balance between stage 1 and stage 2 should not really depend on the 
> limits chosen since the number of temporaries required is going to be 
> independent of the limit, at any rate above an unrealistically small value.

I agree.  The number of temporaries used depends upon the choice of the D
and E parameters, and D is capped at max(2310, sqrt(B2-B1)).  However, this
cap only comes into effect with small exponents.  In most cases the
available memory will be the limiting factor.
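To make the temporaries point concrete, here is a rough sketch of the
general scheme as I understand it (my assumption, not Prime95's actual
code): stage 2 strides through (B1, B2] in blocks of D, keeping one
precomputed power per residue class coprime to D; pairing j with D-j
halves that, giving about phi(D)/2 temporaries.

```python
from math import gcd

def stage2_temporaries(D):
    """Count residues j in (0, D/2) coprime to D -- roughly the number
    of temporaries a D-stride stage 2 must hold (about phi(D)/2, since
    j and D-j can share one table entry)."""
    return sum(1 for j in range(1, (D + 1) // 2) if gcd(j, D) == 1)

# Larger D means fewer stage 2 "giant steps" but more temporaries,
# which is why the memory allowance effectively determines how high
# a B2 is worth choosing.
for D in (210, 420, 2310):
    print(D, stage2_temporaries(D))
```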

> Regards
> Brian Beesley

Daran G.
_________________________________________________________________________
Unsubscribe & list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers