Re: Mersenne: P-1 Puzzle

2002-06-12 Thread Daran

- Original Message -
From: Brian J. Beesley [EMAIL PROTECTED]
To: Daran [EMAIL PROTECTED]; [EMAIL PROTECTED]; Anurag Garg
[EMAIL PROTECTED]
Sent: Tuesday, June 11, 2002 8:23 PM
Subject: Re: Mersenne: P-1 Puzzle

 On Tuesday 11 June 2002 06:13, Daran wrote:

 [... snip ... interesting but non-contentious]

  Very noticeable is the proportion of exponents - in all three ranges -
  which are not getting a stage two effort at all.  26 out of the 85
  exponents between 7950000 and 7960000, 24 out of 54 between 15500000 and
  15505000, 35 out of 57 between 33219000 and 33223000.  I do not
  believe that large numbers of P4 systems are being shipped with just
  8MB of RAM!

 This is true. However the philosophy of the project, correctly in my view,
 is that the software should not cause noticeable deterioration in the
 performance of a system when it is being run in the background to normal
 work.

I agree.  My remarks were intended to make the case for spinning P-1 off
into a separate work type (yeah, I know, it's difficult to change the
server code), and to encourage other readers of this list to consider
focussing on P-1 work.

[...]

 The default has to be safe;...

Again, I agree.  While there will be some people who have made a deliberate
decision not to allocate extra memory, in many cases people will simply have
accepted the default, which means that some machines which could allocate
more memory to stage 2 without adversely affecting the user won't be
configured to do this.  However that same tendency to accept defaults puts
GIMPS programmers under an obligation to set those defaults conservatively.

 ...IMO the current default memory allowance of 8MB
 is entirely reasonable, even though it causes P-1 to run stage 1 only for
 any realistic assignment, and even though _new_ systems are usually
 delivered with at least 256 MB RAM.

Against that is the observation that the basic memory footprint has barely
changed in the over three years I've been with the project, while typical
system memory has increased by a factor of four or more.  A default set to
10% of available memory would allow a 256MB system to perform a modest stage
2 on low assignments in the test range, and on DC assignments, while still
using proportionately less memory than three years ago.  The effect of this
could be further mitigated if the memory dialog included radio buttons to
further limit memory usage to 'minimum' (the default), with other options
being 'reasonable' and 'desirable' (as described in the helpfile), as well
as 'maximum' and 'Do not run'.

Thus the default would be to run a minimal stage 2 provided it could be done
in 10% of system memory or less.  I would consider this to be reasonable and
conservative.
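To put numbers on that, here is a throwaway sketch of the proposed rule; the helper name and the 8MB floor are my own illustrative assumptions, not actual prime95 behaviour:

```python
# Hypothetical sketch of the proposed default: allow P-1 stage 2 to use
# up to 10% of installed RAM, but never less than the current 8MB default.
# Illustrative only; prime95's real Memory= setting is a fixed number of
# megabytes chosen by the user.

def proposed_stage2_allowance_mb(installed_ram_mb, fraction=0.10, floor_mb=8):
    """Return the suggested default stage-2 memory allowance in MB."""
    return max(floor_mb, int(installed_ram_mb * fraction))

for ram in (64, 256, 512, 1024):
    print(ram, "MB installed ->", proposed_stage2_allowance_mb(ram), "MB for stage 2")
```

So a 256MB machine would offer about 25MB to stage 2 - enough for a modest stage 2 on DC-range exponents - while an old 64MB box would stay at the 8MB floor.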

 Running P-1 on a 10 million digit exponent requires in excess of 64 MB
 memory to be allocated in order to run stage 2 at all. That's a lot to ask
 as a default!

It is.  OTOH if the box has 1GB installed, then it's not so much.

 Regards
 Brian Beesley

Daran


_
Unsubscribe & list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ  -- http://www.tasam.com/~lrwiman/FAQ-mers



Re: Mersenne: P-1 Puzzle

2002-06-11 Thread Daran

- Original Message -
From: Daran [EMAIL PROTECTED]
To: [EMAIL PROTECTED]
Sent: Monday, June 10, 2002 6:54 AM
Subject: Re: Mersenne: P-1 Puzzle

 This appears to have happened to me at least once.  I'll spend some time
 later today cross-referencing my WTD file against pminus1.txt to see if
 there are any more I don't need to do.

Yesterday morning, shortly after 6 UTC, I reserved 42 DC exponents between
7.08M and 7.74M, i.e. all were recently expired exponents.  Of these 27 had
the P-1 bit set, all of which were P-1 complete, according to the
pminus1.txt file dated June 9.  Of the 15 with P-1 bit clear, one was in
fact P-1 complete.  I also reserved 32 test exponents between 12.7 and
13.8M.  Of these, 27 had the P-1 bit set, 5 clear.  None of these disagreed
with pminus1.txt.  Finally I checked 28 DC exponents which I had previously
reserved, between 7.23M and 7.74M.  All had the P-1 bit clear.  (I routinely
unreserve any exponents I get with P-1 bit set.)   Of these, two were P-1
complete.

From this, admittedly small and non-random sample, it appears that about
one-third of *reassignments* of expired DC exponents have the P-1 bit clear,
and that about one in fifteen of these are incorrect, leading to a redundant
P-1 rerun by the new owner.  I don't know what proportion of DC assignments
are reassignments, but given that P-1 testing takes about 2% of the time to
run a DC test, I would suggest that the impact of this problem upon the
project as a whole is negligible.
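The back-of-envelope arithmetic behind that claim, using the rough figures from the sample above (these are estimates from a small sample, not measured values):

```python
# Rough estimate of DC effort wasted on redundant P-1 reruns.
reassigned_bit_clear = 1 / 3      # reassignments arriving with P-1 bit clear
bit_clear_but_done   = 1 / 15     # of those, fraction already P-1 complete
p1_cost_vs_dc        = 0.02       # P-1 time as a fraction of a full DC test

wasted_fraction = reassigned_bit_clear * bit_clear_but_done * p1_cost_vs_dc
print(f"~{wasted_fraction:.4%} of DC effort wasted per reassignment")
```

That comes to well under a tenth of a percent, hence "negligible" for the project as a whole.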

However, for anyone processing a lot of exponents, particularly those
specialising in collecting reassignments, it might be worth putting together
a script to identify these anomalies.
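A minimal sketch of such a script. The field layouts assumed here (worktodo.ini `DoubleCheck=exponent,bits,p1bit` lines; pminus1.txt lines starting with the exponent) are guesses from context, not documented formats, so adjust the parsing to the real files:

```python
# Sketch of an anomaly checker: flag worktodo.ini entries whose P-1 bit
# is clear even though pminus1.txt already records a completed P-1 run.
import re

def load_p1_done(pminus1_path):
    """Collect exponents listed in pminus1.txt (P-1 already performed)."""
    done = set()
    with open(pminus1_path) as f:
        for line in f:
            m = re.match(r"\s*(\d+)", line)
            if m:
                done.add(int(m.group(1)))
    return done

def find_anomalies(worktodo_path, p1_done):
    """Yield exponents assigned with the P-1 bit clear but already P-1 complete."""
    with open(worktodo_path) as f:
        for line in f:
            m = re.match(r"DoubleCheck=(\d+),(\d+),(\d)", line.strip())
            if m:
                exponent, _bits, p1_bit = map(int, m.groups())
                if p1_bit == 0 and exponent in p1_done:
                    yield exponent
```

Running `find_anomalies("worktodo.ini", load_p1_done("pminus1.txt"))` would then list the exponents on which a redundant P-1 is about to be wasted.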

 With 512MB I probably have considerably more available memory
 than the average machine doing DCs now.  That will be less true for
 machines doing DCs in the future, and probably not true at all for
 machines doing first time tests.

Inspecting pminus1.txt confirms this.  I would do a 7.96M DC exponent with
B1=45000,B2=675000.  Ignoring those which were obviously P-1ed as first-time
test exponents, or where the tester has chosen to go well beyond the optimal
limits, there are very few in this range which have been factored deeper
than this, and many which have had stage 1 only or a very limited stage 2.
I would do a 15.5M test exponent with B1=190000,B2=4227500, which is
better than average, but not to the same extent as with the DC range.  And I
can P-1 many DC assignments in the time I'd take to do a single test
assignment.  I'd do a 10M digit exponent with B1=375000,B2=8156250, still
better than average, but there are quite a few that go much deeper.

Very noticeable is the proportion of exponents - in all three ranges - which
are not getting a stage two effort at all.  26 out of the 85 exponents between
7950000 and 7960000, 24 out of 54 between 15500000 and 15505000, 35 out of 57
between 33219000 and 33223000.  I do not believe that large numbers of P4
systems are being shipped with just 8MB of RAM!

Regards

Daran


_
Unsubscribe  list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ  -- http://www.tasam.com/~lrwiman/FAQ-mers



Re: Mersenne: P-1 Puzzle

2002-06-09 Thread Brian J. Beesley

On Sunday 09 June 2002 08:22, Daran wrote:
 I'm currently concentrating exclusively on P-1 work.  The primenet server
 doesn't support this as a dedicated work type, so my procedure is to
 reserve some DC exponents, immediately unreserve any which have the P-1 bit
 already set, P-1 test the rest, then unreserve them without doing any LL
 testing.

 One problem I have discovered is that the server doesn't always 'recognise'
 that a P-1 result has been returned.  It can take several days before my
 individual account report removes the * indicating that factoring work is
 necessary.  In these cases I hold on to the exponent until the result is
 recognised in order to stop the subsequent 'owner' from doing a redundant
 P-1 check.  In other cases, the P-1 result is recognised immediately.

Though I'm not looking for P-1 specifically, I have seen something similar on 
a large number of occasions.

My current assignment report - the DC part of which follows - contains a 
number of examples. 

 6493831  D   64           3.3  33.8  93.8  07-Jun-02 07:25  06-Jun-02 06:02  cabbage         0  v18
 6530189  D   64           2.3  27.8  64.8  08-Jun-02 06:02  07-Jun-02 06:02  nessus-b      266  v19/v20
 6672569  D   64          31.3  13.8  73.8  14-May-02 07:43  09-May-02 06:05  cabbage         0  v18
 6881321  D   64           6.3  23.8  63.8  06-Jun-02 06:06  03-Jun-02 06:06  nessus-j      332  v19/v20
 6972001  D*  64           0.3  14.7  60.7                   09-Jun-02 04:02  caterpillar   654  v19/v20
 7009609  D   63  3949088  24.3   9.8  64.8  07-Jun-02 06:04  16-May-02 06:06  nessus-m      266  v19/v20
 7068857  D   63  5887578  25.3   0.8  60.8  06-Jun-02 06:06  15-May-02 06:05  nessus-j      332  v19/v20
 7076669  D*  64  5617988  30.3   3.8  63.8  07-Jun-02 06:02  10-May-02 06:04  nessus-b      266  v19/v20
 7099163  D   63  2693359  14.3  11.8  65.8  09-Jun-02 06:26  26-May-02 06:43  T4070         366  v19/v20
 7908091  D   64  3080191  17.7  15.4  60.4  08-Jun-02 21:12  22-May-02 19:17  broccoli      400  v19/v20
 7937717  D   64  2359295  10.5   7.6  60.6  09-Jun-02 02:04  30-May-02 00:30  caterpillar   654  v19/v20
 7938407  D   64  1310720  10.3  12.3  60.3  08-Jun-02 20:29  30-May-02 04:16  vision.artb   495  v19/v20
 7940447  D   64           9.8  16.8  65.8  09-Jun-02 06:24  30-May-02 17:39  Simon1       1002  v19/v20
 7951049  D   64    65536   7.5  10.7  60.7  09-Jun-02 04:31  02-Jun-02 00:40  rhubarb       697  v19/v20

6972001 and 7076669 are starred although the fact bits column seems to 
indicate that both trial factoring to 2^63 and P-1 have been run. This is 
_definitely_ true for P-1 on 7076669, the fact is recorded on my system in 
both results.txt & prime.log. So far as 6972001 is concerned, the database 
(dated 2nd June) indicates P-1 has been run to a reasonable depth but trial 
factoring has only been done through 2^62. My system definitely won't have 
done any more trial factoring yet, let alone reported anything, since that 
system is set up with v22 defaults i.e. defer factoring on new assignments 
until they reach the head of the queue.

7009609, 7068857 & 7099163 are not starred although the fact bits column 
is one short. The nofactor & Pminus1 databases (dated 2nd June) give 
these all trial factored through 2^62 & Pminus1 checked to B1=35000, 
B2=297500 (or higher). The P-1 limits seem sensible for DC assignments, but 
shouldn't these have been trial factored through 2^63 like most of the other 
exponents in this range?

 Currently, I have nine exponents 'warehoused' whose P-1 results have been
 returned but not recognised, the oldest was done on May 14, which is rather
 longer than I would expect.  There's no question that the server has
 correctly received the result, because it is contained in a recent version
 of the pminus1.zip file downloaded this morning along with another four
 exponents 'warehoused' from May 20.  Three more, whose results were
 returned on June 3 have not yet been recorded in this file.

 There is an entry in the file for the last of the nine, returned on June 5,
 but the limits are much smaller than the test I did.  The most likely
 explanation is this is a previous owner's P-1 result which wasn't
 recognised before the exponent was given to me.

I wonder what happens if you're working like Daran and someone returns a P-1 
result independently (either working outside PrimeNet assignments, or 
perhaps letting an assigned exponent expire but then reporting results); if 
PrimeNet gets two P-1 results for the same exponent, which does it keep?

This is not trivial; e.g. if you get no factors with B1=100000, B2=1000000 and 
no factors with B1=200000, B2=200000, there might still be a factor which would 
be found if you ran with B1=200000, B2=1000000. Also, if the database says 
that P-1 stage 1 only has been run (probably due to memory constraints on the 
system it ran on), at what point is it worthwhile running P-1 for the 
possible benefit of finding a factor in stage 2?

Re: Mersenne: P-1 Puzzle

2002-06-09 Thread Nick Glover

At 13:18 09/06/02 +0000, Brian Beesley wrote:
6972001 and 7076669 are starred although the fact bits column seems to
indicate that both trial factoring to 2^63 and P-1 have been run. This is
_definitely_ true for P-1 on 7076669, the fact is recorded on my system in
both results.txt & prime.log. So far as 6972001 is concerned, the database
(dated 2nd June) indicates P-1 has been run to a reasonable depth but trial
factoring has only been done through 2^62. My system definitely won't have
done any more trial factoring yet, let alone reported anything, since that
system is set up with v22 defaults i.e. defer factoring on new assignments
until they reach the head of the queue.

7009609, 7068857 & 7099163 are not starred although the fact bits column
is one short. The nofactor & Pminus1 databases (dated 2nd June) give
these all trial factored through 2^62 & Pminus1 checked to B1=35000,
B2=297500 (or higher). The P-1 limits seem sensible for DC assignments, but
shouldn't these have been trial factored through 2^63 like most of the other
exponents in this range?

Regarding the star, I've noticed that if an exponent doesn't need any 
factoring when you check it out, the star pretty much always stays there 
throughout the entire test.  If an exponent does need factoring, the star 
usually disappears after you finish it (Daran's observations being a 
counterexample).

However, I have noticed situations where there are discrepancies with the 
factoring bit depths.  Specifically:

7063451 D 63

This one has P-1 factoring done ( which I did ), but is only factored to 
2^62 ( instead of 2^63 ).  The nofactor.cmp file on the GIMPS status page 
agrees with this.  However, my worktodo.ini says 
DoubleCheck=7063451,63,1, which possibly has prevented me from doing 
trial factoring that I would have otherwise done.

Also when I checked out these exponents they had P-1 done, but needed trial 
factoring to 2^63:

6518503 D* 63 -- DoubleCheck=6518503,62,1
6528541 D* 63 -- DoubleCheck=6528541,62,1

I did the trial factoring on one machine with Factor=6518503,62, and put 
them on another machine with DoubleCheck=6518503,63,1.  When I finished 
the trial factoring, the star went away, but the factored depth did not 
update.  I decided to release 6518503 back to the server, and someone else 
checked it out, and it still looks like 6518503 D* 63, so I fear they may 
repeat the factoring I did.  Was this caused by finishing the factoring 
with a Factor= line instead of a DoubleCheck= line?


Nick Glover
[EMAIL PROTECTED]
Computer Science, Clemson University

It's good to be open-minded, but not so open that your brains fall out. - 
Jacob Needleman

_
Unsubscribe  list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ  -- http://www.tasam.com/~lrwiman/FAQ-mers



Re: Mersenne: P-1 Puzzle

2002-06-09 Thread Daran

- Original Message -
From: Brian J. Beesley [EMAIL PROTECTED]
To: Daran [EMAIL PROTECTED]; [EMAIL PROTECTED]
Sent: Sunday, June 09, 2002 2:18 PM
Subject: Re: Mersenne: P-1 Puzzle

 6972001 and 7076669 are starred although the fact bits column seems to
 indicate that both trial factoring to 2^63 and P-1 have been run. This is
 _definitely_ true for P-1 on 7076669, the fact is recorded on my system in
 both results.txt & prime.log.

It also should be recorded in the worktodo.ini file for that exponent.  Now
if you were to unreserve that exponent, then contact the server sufficiently
late in the day that it is immediately reassigned back to you (rather than
giving you a smaller one), and then check the WTD file, you'll find that
the P-1 bit has been unset.  (This is how I discovered the problem in
the first place.)  There is a small risk of losing the assignment, if
someone checks it out seconds after you unreserve it.

 So far as 6972001 is concerned, the database
 (dated 2nd June) indicates P-1 has been run to a reasonable depth but
 trial factoring has only been done through 2^62. My system definitely
 won't have done any more trial factoring yet, let alone reported anything,
 since that system is set up with v22 defaults i.e. defer factoring on new
 assignments until they reach the head of the queue.

I'll bet its worktodo.ini entry also has the P-1 bit unset, meaning that you
will do a redundant P-1 unless you manually fix it.  What does it give for
the factored bits?  62, or 63?

 7009609, 7068857 & 7099163 are not starred although the fact bits
 column is one short. The nofactor & Pminus1 databases (dated 2nd June)
 give these all trial factored through 2^62 & Pminus1 checked to B1=35000,
 B2=297500 (or higher). The P-1 limits seem sensible for DC assignments,
 but shouldn't these have been trial factored through 2^63 like most of the
 other exponents in this range?

Again, what does your WTD file say?

[...]

 I wonder what happens if you're working like Daran and someone returns a
 P-1 result independently (either working outside PrimeNet assignments,
 or perhaps letting an assigned exponent expire but then reporting
 results);

Or if someone is working like me who hasn't identified the problem.  If they
unreserve an exponent whose P-1 hasn't been recognised by the server, then
the next owner will do another one, with possibly different limits.

This appears to have happened to me at least once.  I'll spend some time
later today cross-referencing my WTD file against pminus1.txt to see if
there are any more I don't need to do.

 This is not trivial; e.g. if you get no factors with B1=100000, B2=1000000
 and no factors with B1=200000, B2=200000, there might still be a factor
 which would be found if you ran with B1=200000, B2=1000000. Also, if
 the database says that P-1 stage 1 only has been run (probably due to
 memory constraints on the system it ran on), at what point is it worthwhile
 running P-1 for the possible benefit of finding a factor in stage 2?

That question generalises.  If the database shows that a shallow P-1 has
been run, at what point is it worth running a deeper one, and with what
limits?

Suppose the a priori optimal bounds for an exponent are, for example,
B1=40000,B2=600000, but it turns out that only stage 1 has been run to
40000.  Assuming that it's worth re-running the P-1 at all, it might be
better to drop the B1 limit - to 35000, say - and increase the B2 limit.
This would reduce the factor-space overlap with the first test.
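The overlap question can be made concrete with a toy model of P-1 coverage. A factor f = 2kp + 1 of M_p is found when f - 1 is B1-smooth apart from at most one prime in (B1, B2]; the factors 2 and p come for free, so it is enough to look at the multiplier k. All the helper names and example bounds below are my own illustrative choices:

```python
# Toy model: would P-1 with bounds B1, B2 find a factor with multiplier k?

def factorize(n):
    """Trial-division factorization: {prime: exponent}."""
    fac, d = {}, 2
    while d * d <= n:
        while n % d == 0:
            fac[d] = fac.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        fac[n] = fac.get(n, 0) + 1
    return fac

def p1_finds(k, B1, B2):
    """True if bounds B1, B2 cover multiplier k."""
    fac = factorize(k)
    big = [q for q, e in fac.items() if q ** e > B1]
    if not big:
        return True                      # stage 1 alone suffices
    # stage 2 tolerates exactly one extra prime, to the first power, <= B2
    return len(big) == 1 and fac[big[0]] == 1 and big[0] <= B2

def next_prime(n):
    """Smallest prime strictly greater than n (naive, fine at this size)."""
    def is_prime(m):
        d = 2
        while d * d <= m:
            if m % d == 0:
                return False
            d += 1
        return m > 1
    n += 1
    while not is_prime(n):
        n += 1
    return n

# A k that slips through two separate runs but not the combined bounds:
k = next_prime(150000) * next_prime(900000)
print(p1_finds(k, 100000, 1000000))   # smaller prime exceeds B1
print(p1_finds(k, 200000, 200000))    # larger prime exceeds B2
print(p1_finds(k, 200000, 1000000))   # combined bounds cover it
```

This is exactly why rerunning with a lower B1 and a higher B2 trades away factor-space already covered by the first run in exchange for new coverage.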

What's missing in all this is any kind of quantitative analysis.  In any
case, as long as there are exponents which haven't had a P-1 at all, I
prefer to stick to them.

[...]

 Daran, my advice would be to concentrate on exponents above the current DC
 assignment range which have already been LL tested but not P-1 checked, or
 on exponents above the current LL assignment range which have been trial
 factored but not P-1 checked, according to the official database (updated
 weekly - you will need the pminus1, nofactor & hrf3 database files, plus
 the decomp utility to unscramble nofactor).

I have these files, but following your advice would undermine my rationale
for doing this.  With 512MB I probably have considerably more available memory
than the average machine doing DCs now.  That will be less true for machines
doing DCs in the future, and probably not true at all for machines doing
first time tests.

I can work around the problem while staying within the current DC
range.  It's just an irritation.

 Regards
 Brian Beesley

Daran


_
Unsubscribe  list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ  -- http://www.tasam.com/~lrwiman/FAQ-mers