Re: FW: Mersenne: Re: Factoring Failure?
On Tue, 2 Oct 2001 19:52:42 -, [EMAIL PROTECTED] wrote:

> However if it could be established that all the "missed" factors
> reported were the work of one user, perhaps it would be worth fixing
> the database to force rerunning of trial factoring for those factoring
> assignments run by that user when the exponents are reassigned
> for double checking (or LL testing).

Given the scale of the bad results (probably a fair bit over 60 exponents, at a rough guess), if I were king, I would just block the responsible user until I got a reasonable explanation. Many of the numbers listed in the original post, however, are already behind the "main front" of double-checking. Of course, we would only expect to be finding those that are, but I'd guess no more than 200-300 exponents are involved (and likely fewer).

If it was a computer error of some sort, I wouldn't expect to see that many errors. That said, I vaguely recall reading somewhere that some versions of Windows always give the same memory range to the same program (the context was that what appears to be an error in a given program may cause major general problems under Linux). If that is the case, is it possible that, every time Prime95 on a given system started up, its executable was loaded on top of a bad range of memory in just such a way as to make it impossible to find a factor (say, by changing the expected output of the function when one is found)? This is speculation on my part, of course, but I think it's worth mentioning.

Nathan
_
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ -- http://www.tasam.com/~lrwiman/FAQ-mers
Re: FW: Mersenne: Re: Factoring Failure?
On Wed, 03 Oct 2001 18:29:52 -0700, Gerry Snyder <[EMAIL PROTECTED]> wrote:

> But, at least in theory, every Mersenne number proven non-prime will
> eventually be factored. Again, to me, so what? At least the LL test
> showed that further factoring activity would eventually succeed.

It might be pointed out that we are still finding factors of numbers in the same size range as some found to be prime in the 1950s, and that it may well be impossible (due to such things as the uncertainty principle, the speed of light, and the need not to have to worry about whether there actually is an electron crossing a closed switch when one ought to be) to factor some of the numbers now being tested.

Right now, it's well under an hour's work to prove *any* 700-digit number prime, but factoring general numbers of the same size in an hour would be a very good way to get very rich (legally or otherwise) very quickly.

> PS I just got a chuckle from imagining a very competitive team tearing
> down an opponent by finding what numbers the opponent had done LL tests
> on, and factoring them.

If things are being done properly, they should be better off by far doing their own tests, since the factoring bounds are chosen to stop when it's no longer worthwhile to continue factoring in the hope of averting both the first and second LL tests.

Nathan
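[As an aside for anyone following the thread without the background: the Lucas-Lehmer test everyone keeps referring to is short enough to sketch. This naive version is written for this post, not taken from any GIMPS client; Prime95 does the squarings with FFT multiplication, which this does not attempt.]

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test: for an odd prime p, M_p = 2^p - 1 is prime
    iff s_(p-2) == 0, where s_0 = 4 and s_(i+1) = s_i^2 - 2 (mod M_p)."""
    m = (1 << p) - 1          # the Mersenne number 2^p - 1
    s = 4
    for _ in range(p - 2):    # p - 2 squarings modulo M_p
        s = (s * s - 2) % m
    return s == 0

print(lucas_lehmer(7))   # True:  M_7 = 127 is prime
print(lucas_lehmer(11))  # False: M_11 = 2047 = 23 * 89
```

The cost is p - 2 multiplications of p-bit numbers, which is why a "wasted" LL test on an exponent with a missed small factor hurts so much compared with the microseconds it takes to try a candidate factor.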
Re: FW: Mersenne: Re: Factoring Failure?
Carleton Garrison wrote:

> [? wrote:]
>
> > This is why I like that you lose credit for an LL test if someone else
> > finds a factor later, or if two other independent checks prove your
> > result to be wrong.
>
> Me too. I understand that George's top producer page does this, while the
> PrimeNet stat page does not. PrimeNet really needs this capability.

To me, there is no question that an LL test that is shown to be wrong should not count for anything. The number still required two more LL tests, so it is as if the erroneous one had not been done.

But, at least in theory, every Mersenne number proven non-prime will eventually be factored. Again, to me, so what? At least the LL test showed that further factoring activity would eventually succeed.

I have nothing against George doing things that way. (When I play ball with him, I play by his rules or I don't play at all. You know why? Because it's his ball, that's why.) Seriously, I can see some point to doing things that way, but I would probably do it differently. But even more seriously, I'm just glad to be in the game, and I am grateful to George and all the others who have made it easy and fun to participate.

Gerry

PS I just got a chuckle from imagining a very competitive team tearing down an opponent by finding what numbers the opponent had done LL tests on, and factoring them.
Re: FW: Mersenne: Re: Factoring Failure?
> > I believe the idea of trying to skip P-1 factoring was talked about within
> > the last 3 or 4 months. Apparently there are people who would just prefer
> > to get credit for doing LL work than to find factors.
>
> This is why I like that you lose credit for an LL test if someone else
> finds a factor later, or if two other independent checks prove your
> result to be wrong.

Me too. I understand that George's top producer page does this, while the PrimeNet stat page does not. PrimeNet really needs this capability.

Carleton Garrison
LL#163 F#295 G#253
www.teamprimerib.com
Re: FW: Mersenne: Re: Factoring Failure?
> At 04:07 PM 9/30/2001 -0700, Daniel Swanson wrote:
> > I went through the Cleared Exponents report looking for other examples
> > of factors found during double-checks that should have been found
> > during the initial factorization.
> >
> >  5977297  53  DF    6726544627832489
> >  6019603  57  DF  137024179940485697
> >  7019297  57  DF  160100125459121849
> >  7020641  58  DF  226230108157229263
> >  7025987  56  DF   74052063365823791
> >  7027303  55  DF   31090234297428433
> > 10159613  56  DF   68279769831982367
> >
> > Were numbers in this range all originally factored by the same user
> > or computer?
>
> Either way, GIMPS has never considered missing a factor as a big deal.
> It only means some wasted effort running an LL test that could have
> been avoided.

I wonder whether, by using configuration settings, people are able to skip some, if not all, factoring stages? I believe the idea of trying to skip P-1 factoring was talked about within the last 3 or 4 months. Apparently there are people who would just prefer to get credit for doing LL work than to find factors.

Until factoring time (while one is LL testing) is credited at the same rate as LL testing - let alone the common case where it earns no credit at all because no factor is found - results like the above could become commonplace.
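[For context on what is being skipped: P-1 stage 1 is, in outline, one big modular exponentiation followed by a GCD. The sketch below is written for this post under stated assumptions - base 3, stage 1 only, made-up function names - and is not Prime95's actual code, which also runs a stage 2 with a larger bound.]

```python
from math import gcd

def sieve(bound):
    """Primes up to `bound` via a simple sieve of Eratosthenes."""
    flags = bytearray([1]) * (bound + 1)
    flags[:2] = b"\x00\x00"
    for i in range(2, int(bound ** 0.5) + 1):
        if flags[i]:
            flags[i * i :: i] = bytearray(len(flags[i * i :: i]))
    return [i for i, f in enumerate(flags) if f]

def pminus1_stage1(p, B1):
    """Stage 1 of P-1 against M_p = 2^p - 1.

    Finds a prime factor q whenever q - 1 is B1-smooth (every prime
    power dividing q - 1 is <= B1). Any factor of M_p is 1 (mod 2p),
    so 2p is folded into the exponent for free.
    """
    n = (1 << p) - 1
    a = pow(3, 2 * p, n)               # start from 3^(2p) mod n
    for q in sieve(B1):
        e = q
        while e * q <= B1:             # largest prime power of q below B1
            e *= q
        a = pow(a, e, n)               # overall: a = 3^E, E = 2p * prod q^e
    g = gcd(a - 1, n)
    return g if 1 < g < n else None    # g == n means every factor was smooth

# M_29 = 233 * 1103 * 2089; 233 - 1 = 2^3 * 29, and the 29 comes free
# from the 2p term, so even B1 = 5 finds the factor 233.
print(pminus1_stage1(29, 5))   # 233
```

This also shows why P-1 only catches a small proportion of "small" factors: a factor q is found only if q - 1 happens to be smooth up to the chosen bounds.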
Re: FW: Mersenne: Re: Factoring Failure?
-----Original Message-----
From: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
To: [EMAIL PROTECTED] <[EMAIL PROTECTED]>
Date: Tuesday, October 02, 2001 3:44 PM
Subject: Re: FW: Mersenne: Re: Factoring Failure?

> > Either way, GIMPS
> > has never considered missing a factor as a big deal. It only means
> > some wasted effort running a LL test that could have been avoided.
>
> True enough - though I'm concerned that the "no factors below 2^N"
> database may be seriously flawed, from the point of view of GIMPS
> it would seem to be a waste of time to go round redoing trial
> factoring just to fix this problem.

Yes, from the point of view of GIMPS (that is, searching for Mersenne primes) it's not a huge deal... but there also exists an effort to fully factor the candidates that are not prime, and this throws a big problem into that project. Someone could be trial factoring an exponent from 2^59 to 2^65 and find a factor in that range after a smaller factor had been missed, and it will go into the database as the smallest factor when it actually is not. Might be decades before the smaller factor is discovered. Oh well,

Steve
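[Steve's "smallest known factor" hazard is easy to make concrete. Below is a toy sketch of trial factoring written for this post, not taken from any GIMPS client: candidate factors of M_p have the form 2kp+1 and must be 1 or 7 (mod 8), and they are tried in increasing order, so a pass that starts above a missed small factor will faithfully report a larger factor as the smallest known.]

```python
def trial_factor(p, low_bits, high_bits):
    """Smallest factor q of 2^p - 1 with 2^low_bits <= q < 2^high_bits,
    or None if there is none in that range.

    Candidates have the form q = 2*k*p + 1 with q = 1 or 7 (mod 8);
    q divides 2^p - 1 exactly when 2^p = 1 (mod q).
    """
    k = 1
    while True:
        q = 2 * k * p + 1
        if q >= 1 << high_bits:
            return None
        if q >= 1 << low_bits and q % 8 in (1, 7) and pow(2, p, q) == 1:
            return q
        k += 1

# M_11 = 2047 = 23 * 89. A full pass finds 23, but a pass starting at
# 2^5 (i.e. after 23 was somehow missed) records 89 as "smallest":
print(trial_factor(11, 0, 12))  # 23
print(trial_factor(11, 5, 12))  # 89
```

Nothing in the later pass can notice the earlier miss, which is exactly why a flawed "no factors below 2^N" database silently corrupts the full-factorization effort.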
Re: FW: Mersenne: Re: Factoring Failure?
If we could indeed track these to a single user, I've got about 25-30 AMD 1.2 GHz processors that I could throw at the situation for a short time, just to quickly re-trial-factor these and put our minds to rest.

Aaron

----- Original Message -----
> However if it could be established that all the "missed" factors
> reported were the work of one user, perhaps it would be worth fixing
> the database to force rerunning of trial factoring for those factoring
> assignments run by that user when the exponents are reassigned
> for double checking (or LL testing).
>
> Regards
> Brian Beesley
Re: FW: Mersenne: Re: Factoring Failure?
On 1 Oct 2001, at 22:23, Jean-Yves Canart wrote:

> I have browsed some logs I archived a long time ago and I have found
> this:
>
> In May 1998, one user, "tomfakes", cleared around 80 exponents with
> factor found = "1". It was in the range 7013000-7055000.

Well, (s)he's not lying - n = 0 (mod 1) is a property of integers ;-)

In any event: (a) this is the _opposite_ of the reported problem - what seems to have happened is that "no factor found" was being reported, sometimes erroneously; (b) this won't get through now that PrimeNet validates submitted factors; the code I wrote for this purpose rejects as garbage any single-digit factor, after stripping off any leading zeroes as well as white space. (Obviously a Mersenne number with a prime exponent p > 5 cannot have any factors less than 10, and we know pretty much all there is to know about exponents up to and including 5, so excluding these is not a practical problem.)

> > At 04:07 PM 9/30/2001 -0700, Daniel Swanson wrote:
> > > I went through the Cleared Exponents report looking for other
> > > examples of factors found during double-checks that should have
> > > been found during the initial factorization.
> > >
> > >  5977297  53  DF    6726544627832489
> > >  6019603  57  DF  137024179940485697
> > >  7019297  57  DF  160100125459121849
> > >  7020641  58  DF  226230108157229263
> > >  7025987  56  DF   74052063365823791
> > >  7027303  55  DF   31090234297428433
> > > 10159613  56  DF   68279769831982367
> > >
> > > Were numbers in this range all originally factored by the same user
> > > or computer?
> >
> > My logfiles from that long ago have been zipped and stored on CDROM.
> > It is possible that 7,010,000 - 7,030,000 were all factored by one
> > person. It was not uncommon for me to hand out large blocks for
> > factoring to users without Internet connections. While I no longer
> > do this, there are a handful of users pre-factoring the 20,000,000 -
> > 80,000,000 area. I hope their machines are reliable!! They
> > probably are, as they are finding the expected number of factors.

The primes from that block of 20,000 numbers represent quite a bit of work and map poorly onto the "missed" factors reported. A few mistakes are inevitable but, since testing a factor takes on the order of a microsecond on current systems, hardware glitches shouldn't be much of a risk. (Unless they get into the code stream used to generate potential factors?)

Reports of two "missed" factors of exponents within spitting distance of 6,000,000 and no fewer than four just over 7,000,000 look high for random glitches to be responsible, even on really ropy hardware. Remember that P-1 (which found the factors missed by trial factoring) can only find a small proportion of the "small" factors, especially when it's being run with "double checking" limits.

> > Anyway, it doesn't appear to be a program bug as you were able to
> > find the factor with trial factoring. I'm guessing either bad
> > hardware or an older prime95 version had a bug.

If it _was_ Prime95. There are other factoring programs out there; maybe there was a higher incidence of use about 3.5 years ago, when these exponents would have been the subject of factoring assignments.

> > Either way, GIMPS
> > has never considered missing a factor as a big deal. It only means
> > some wasted effort running a LL test that could have been avoided.

True enough - though I'm concerned that the "no factors below 2^N" database may be seriously flawed; from the point of view of GIMPS it would seem to be a waste of time to go round redoing trial factoring just to fix this problem.

However, if it could be established that all the "missed" factors reported were the work of one user, perhaps it would be worth fixing the database to force rerunning of trial factoring for those factoring assignments run by that user when the exponents are reassigned for double checking (or LL testing).
Regards
Brian Beesley
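[Brian's description of the PrimeNet factor validation is concrete enough to sketch. Everything below - the function name, the divisibility check - is illustrative only, written for this post; the actual server code is not shown anywhere in the thread.]

```python
def validate_factor_report(p, reported):
    """Sketch of the rule Brian describes: strip white space and leading
    zeroes, reject any single-digit 'factor' as garbage (for prime p > 5,
    factors of M_p have the form 2kp+1 >= 15, so none are below 10),
    then check that the claim actually divides M_p = 2^p - 1."""
    s = reported.strip().lstrip("0")
    if len(s) < 2 or not s.isdigit():
        return False               # empty, non-numeric, or single digit
    q = int(s)
    return pow(2, p, q) == 1       # q | 2^p - 1  iff  2^p = 1 (mod q)

print(validate_factor_report(11, "1"))       # False: the 1998 "factor = 1" reports
print(validate_factor_report(11, " 0023 "))  # True:  23 divides M_11 = 2047
```

The divisibility check at the end goes beyond the single-digit rule quoted above, but it is cheap (one modular exponentiation) and would also catch the "no such factor" class of garbage, not just trivially small claims.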