Mersenne Digest        Sunday, January 23 2000        Volume 01 : Number 683




----------------------------------------------------------------------

Date: Sat, 22 Jan 2000 11:35:38 -0800
From: Gerry Snyder <[EMAIL PROTECTED]>
Subject: Mersenne: Best chance to make a "real" contribution?

Hello, Mersenners;

First of all, let me say that it is a thrill to be helping in the GIMPS.
Getting more computing power for the search is the main reason my new
linux system is a dual Celeron rather than a single.

If all I ever do is LL tests of non-primes, that will be fine. That
helps the cause, and I have no expectation of finding a prime.

But finding a factor (or another factor) of a Mersenne number would seem
more real.

Is there any significant probability that two 500 MHz Celerons and one
333 MHz Celeron could accomplish such a feat in a couple of years?

Just thought I'd ask.

Thanks for any help,

Gerry
-- 
mailto:[EMAIL PROTECTED]
Gerry Snyder, AIS Symposium Chair
Region 15 Ass't RVP, JT Chair
Member San Fernando Valley, Southern California Iris Societies
in warm, winterless Los Angeles
_________________________________________________________________
Unsubscribe & list info -- http://www.scruz.net/~luke/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers

------------------------------

Date: Sat, 22 Jan 2000 15:14:51 -0500
From: George Woltman <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Best chance to make a "real" contribution?

Hi Gerry,

At 11:35 AM 1/22/00 -0800, Gerry Snyder wrote:
>First of all, let me say that it is a thrill to be helping in the GIMPS.

Welcome aboard!

>Getting more computing power for the search is the main reason my new
>linux system is a dual Celeron rather than a single.

The second processor is a cheap way to add horsepower.  It will help
on many other computing tasks.  An excellent decision.

>But finding a factor (or another factor) of a Mersenne number would seem
>more real.

Finding new factors isn't hard.  Over half of the candidates are eliminated
by finding a factor rather than the expensive LL test.  GIMPS by default
assigns slower machines to do the factoring work.  Thus, it is not
uncommon for powerful machines to always get LL assignments and
never find a factor.

If you want the thrill of finding a factor, ask one of your CPUs to get
factoring work only (the Test/Primenet dialog box).  You can change this
setting back at a later time.

Finding new factors of small Mersenne numbers, the so-called Cunningham
factors, is getting more difficult.  ECMNet and GIMPS have picked off most
of the "easy" factors.  I have two CPUs running ECM full-time.  The last
Cunningham factor I found was last summer.  I do occasionally find new
factors of medium-sized Mersenne numbers (exponents 1200 to 100000).

In any event, choose the type of work you find most interesting.
The goal here is to have fun while contributing to math research.

Best regards,
George


------------------------------

Date: Sat, 22 Jan 2000 15:44:16 -0500
From: "St. Dee" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Best chance to make a "real" contribution?

At 15:14 01/22/2000 -0500, George Woltman wrote:
>
>Finding new factors isn't hard.  Over half of the candidates are eliminated
>by finding a factor rather than the expensive LL test.  GIMPS by default
>assigns slower machines to do the factoring work.  Thus, it is not
>uncommon for powerful machines to always get LL assignments and
>never find a factor.

This brings up something I've been wondering about.  I have a dual Celeron
setup running 2 instances of mprime under Linux.  With both processors
crunching on LL tests, I get iteration times for each processor of around
0.263 seconds for exponents around 8990000 (where they are presently cranking).
However, if one of them factors while the other does LL testing, the
processor doing the LL testing takes about 0.220 seconds per iteration,
while the one factoring also shows a factoring speed more consistent with
what I would expect for that processor.

My question is:  Do I get more "work" done by having both doing LL testing,
or would this box contribute more to the effort by having one CPU
performing factoring while the other does LL testing?

Thanks!
Kel, coming up on 4 years of hunting for Mersenne primes

------------------------------

Date: Sat, 22 Jan 2000 17:24:22 -0500
From: Foghorn Leghorn <[EMAIL PROTECTED]>
Subject: Mersenne: Factoring beyond ECM

I'm interested in trying to factor composite numbers with 100 to 200
digits. ECM becomes impractical for numbers without any factors below
50 digits or so. I have heard of algorithms such as MPQS which are
used to tackle larger numbers. Are there any (preferably free)
implementations of this method (or another) that would be feasible to
run on a home PC or Unix workstations?

Foghorn Leghorn
[EMAIL PROTECTED]

------------------------------

Date: Sat, 22 Jan 2000 19:41:15 EST
From: [EMAIL PROTECTED]
Subject: Mersenne: Demonstration of LL test for M7

<< A while ago someone posted a demonstration of the Lucas-Lehmer test
 for, I think, 2^7-1. >>

LL test:
S[0] = 4
S[k+1] = S[k]^2 - 2  (mod 2^P - 1)
2^P - 1 is prime if and only if S[P-2] = 0.
Demonstration for 2^7 - 1 = 127:
S[0] = 4
S[1] = 4^2 - 2 mod 127 = 14
S[2] = 14^2 - 2 mod 127 = 194 mod 127 = 67
S[3] = 67^2 - 2 mod 127 = 4487 mod 127 = 42
S[4] = 42^2 - 2 mod 127 = 1762 mod 127 = 111
S[5] = 111^2 - 2 mod 127 = 12319 mod 127 = 0
S[7-2] = S[5] = 0, hence 2^7 - 1 is prime.

QED
YEA
STL
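
The recurrence demonstrated above is only a few lines of code. Here is a
minimal Python sketch (the function name is mine; Python's built-in
arbitrary-precision integers do the heavy lifting):

```python
def lucas_lehmer(p):
    """Lucas-Lehmer test for an odd prime exponent p:
    2^p - 1 is prime if and only if S[p-2] == 0, where
    S[0] = 4 and S[k+1] = S[k]^2 - 2 (mod 2^p - 1)."""
    m = 2**p - 1          # the Mersenne number being tested
    s = 4                 # S[0]
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0

# lucas_lehmer(7) confirms 127 is prime;
# lucas_lehmer(11) reports 2047 = 23 * 89 is not.
```

Prime95/mprime runs the same iteration, but with FFT-based squaring so
that exponents in the millions are feasible.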

------------------------------

Date: Sun, 23 Jan 2000 02:06:26 +0100 (CET)
From: Henrik Olsen <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Factoring beyond ECM

On Sat, 22 Jan 2000, Foghorn Leghorn wrote:
> I'm interested in trying to factor composite numbers with 100 to 200
> digits. ECM becomes impractical for numbers without any factors below
> 50 digits or so. I have heard of algorithms such as MPQS which are
> used to tackle larger numbers. Are there any (preferably free)
> implementations of this method (or another) that would be feasible to
> run on a home PC or Unix workstations?
MPQS is OK for numbers up to about 100 digits, at which point NFS takes
over.

Have a look at Conrad Curry's NFSNET:
 <http://orca.st.usm.edu/~cwcurry/nfs/nfs.html>

> Foghorn Leghorn
> [EMAIL PROTECTED]

-- 
Henrik Olsen,  Dawn Solutions I/S       URL=http://www.iaeste.dk/~henrik/
 Thomas Daggert to Lucifer:
  I have my soul, and I have my faith.  What do you have...  angel?
                                                             The Prophecy



------------------------------

Date: Sat, 22 Jan 2000 22:11:44 -0500
From: Foghorn Leghorn <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Factoring beyond ECM

On Sun, 23 Jan 2000 02:06:26 +0100 (CET), you wrote:
>MPQS is ok for numbers up to about 100 digits, at which time NFS takes
>over.

Is there a good implementation of this available online?

>Have a look at Conrad Curry's NFSNET:
> <http://orca.st.usm.edu/~cwcurry/nfs/nfs.html>

Foghorn Leghorn
[EMAIL PROTECTED]

------------------------------

Date: Sun, 23 Jan 2000 11:01:38 -0000
From: "Brian J. Beesley" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Best chance to make a "real" contribution?

On 22 Jan 00, at 11:35, Gerry Snyder wrote:

> But finding a factor (or another factor) of a Mersenne number would seem
> more real.
> 
> Is there any significant probability that two 500 MHz Celerons and one 333
> MHz Celeron could accomplish such a feat in a couple of years?

Depends where you look - last summer I found tens of thousands of 
smallish factors of Mersenne numbers in the 10 million digit range in 
a few minutes on a PII-350. On the other hand, many, many CPU years 
have been spent by various people trying to find any factor of the 
relatively small composite Mersenne number 2^727-1 without success.

You get kudos in proportion to the "difficulty" of the task you 
achieve (essentially this depends on luck, though you can help it 
along by supplying as much "horsepower" as possible). However, so far 
as I'm concerned, the main idea is to have fun.

Regards
Brian Beesley

------------------------------

Date: Sun, 23 Jan 2000 11:01:38 -0000
From: "Brian J. Beesley" <[EMAIL PROTECTED]>
Subject: Dual CPU usage (was: Re: Mersenne: Best chance to make a "real" contribution?)

On 22 Jan 00, at 15:44, St. Dee wrote:

> This brings up something I've been wondering about.  I have a dual Celeron
> setup running 2 instances of mprime under Linux.  With both processors
> crunching on LL tests, I get iteration times for each processor of around
> .263 for exponents around 8990000 (where they are presently cranking).
> However, if one of them factors while the other does LL testing, the
> processor doing the LL testing takes about .220 seconds per iteration,
> while the one factoring also shows a factoring speed more consistent with
> the speed I would expect for the processor speed.

The "problem" here is that the two processors are sharing access to 
the memory bus. Trial factoring actually uses very little memory 
access (actually it runs more or less from the L1 cache!) whereas LL 
testing really does thrash the memory subsystem (unless you have a 
Xeon with 2MB L2 cache, and even then only if you're testing an 
exponent less than 10,320,000).

I presume you have BX chipset and PC100 memory. Even so, asking for 
twice the 66 MHz bandwidth of each Celeron CPU is going to congest 
the memory subsystem, which explains the slowdown.

However, the problem is not unique to dual Celeron systems. Unless 
you have a multi-processor system with a separate memory bus for each 
processor - and some means of intercommunication e.g. multiporting - 
the same problem is going to strike. To the best of my knowledge, 
only very expensive server motherboards use this technology. (Compaq 
did have a dual CPU workstation MB based on this technology but they 
seem to have dropped it when 100 MHz systems came in.)

For systems using large clock multipliers, increasing the throughput 
of the memory bus will help mprime/Prime95 performance substantially
- even for uniprocessor systems. In my experience, systems running 
the VIA chipset using PC133 memory are very disappointing - my PIII-
533 is turning in times very similar to the PII-400 benchmarks. I get 
the impression that there is a 100 MHz bottleneck inside the chipset, 
even when CPU and memory buses are both set to 133 MHz. Or it's 
actually driving the CPU at 100 MHz irrespective of the board jumpers 
and setup.

Intel's Rambus technology looks interesting from this point of view - 
and boards which use it are starting to filter on to the market. A 
practical problem is that Rambus memory is extortionately expensive - 
at least 4 times the price of SDRAM, and you can't even get modules 
smaller than 64 MB. Intel recently released a version of some boards 
using the 820 chipset (designed for Rambus) which can take 133 MHz 
SDRAM, but there is a "gearbox" between the SDRAM and the chipset 
which reduces the memory bus throughput compared with the native 
Rambus version - though whether it's any worse than you'd expect 
projecting from the benchmarks based on a 100 MHz BX chipset is 
unknown.
> 
> My question is:  Do I get more "work" done by having both doing LL
> testing, or would this box contribute more to the effort by having one CPU
> performing factoring while the other does LL testing?

Personally I think you get better value by running 1 LL test & 1 
"something else" which doesn't make big claims on the memory bus. ECM 
factoring is quite light on the memory bus (though "larger" exponents 
i.e. above about 2,000 might have difficulty fitting well into the 
Celeron L2 cache, which is only 128K) and might make an interesting 
alternative to trial factoring.

There are a considerable number of exponents which have been
double-checked but have not been trial-factored to the full depth
suggested by v19; running some of these would find a number of "first
factors", though I don't think PrimeNet would credit you with the CPU time.

Factoring (of any kind) will not find primes; trial factoring larger 
exponents which will be LL tested later will eliminate some 
candidates, and be credited by PrimeNet (if you use the "automatic" 
method of reporting assignment results); other types of factoring 
would be done out of pure interest, to find new or first factors 
only.

Also, the PrimeNet rankings for LL testing & factoring are separate 
rather than cumulative. If you split your work, your LL testing 
effort will be reduced compared with what it would be if you ran LL 
tests on both CPUs. As a compensation, you would zoom up the 
factoring rankings!

Regards
Brian Beesley

------------------------------

Date: Sun, 23 Jan 2000 14:40:40 +0100
From: "Hans-Martin Anger" <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Factoring beyond ECM

LiDIA is a free package for long number arithmetic.
It includes a demo-program for factoring numbers with trial factoring, ECM
and MPQS successive.
See here: http://www.informatik.tu-darmstadt.de/TI/LiDIA/Welcome.html

regards
Martin


-----Original Message-----
From: Foghorn Leghorn <[EMAIL PROTECTED]>
To: <[EMAIL PROTECTED]>
Sent: Saturday, 22 January 2000 23:24
Subject: Mersenne: Factoring beyond ECM


> I'm interested in trying to factor composite numbers with 100 to 200
> digits. ECM becomes impractical for numbers without any factors below
> 50 digits or so. I have heard of algorithms such as MPQS which are
> used to tackle larger numbers. Are there any (preferably free)
> implementations of this method (or another) that would be feasible to
> run on a home PC or Unix workstations?
>
> Foghorn Leghorn
> [EMAIL PROTECTED]


------------------------------

Date: Sun, 23 Jan 2000 14:28:47 -0000
From: Alex Phillips <[EMAIL PROTECTED]>
Subject: Mersenne: Odds on finding a factor ?

Dear List-reader,
                I've been running Prime95 on my PII-400 at work since December, and I'm
currently on my second LL test !
And I've also been running it on my Celeron366 Laptop at home (When my wife 
isn't playing Settlers 3). I decided to make the Laptop do Factoring, and 
I've factored five numbers, all in the 11650000-11660000 range, as 
allocated by Primenet, without finding a factor.
        So my question is, What are the odds on finding a factor ?

        I've looked in all the FAQs, including the mailing list one, and can't
find it anywhere. I know that the odds for an LL test are 1 in 60000, but I
can't find the odds of finding a factor anywhere. Presumably all the
non-prime numbers have a factor, so the odds should be 1 in 59999, but most
of these factors must be higher than 2^64 (yes/no)? So what are the odds
of finding a factor that Prime95 can actually find?




Alex Phillips

Campaign for Unmetered Telecommunications
http://www.unmetered.org.uk
Campaigning for a fair choice in telecommunications


------------------------------

Date: Sun, 23 Jan 2000 16:23:59 +0100 (CET)
From: Henrik Olsen <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Odds on finding a factor ?

On Sun, 23 Jan 2000, Alex Phillips wrote:
> Dear List-reader,
>               I've been running Prime95 on my PII-400 at work since December, and I'm
> currently on my second LL test !
> And I've also been running it on my Celeron366 Laptop at home (When my wife 
> isn't playing Settlers 3). I decided to make the Laptop do Factoring, and 
> I've factored five numbers, all in the 11650000-11660000 range, as 
> allocated by Primenet, without finding a factor.
>       So my question is, What are the odds on finding a factor ?
> 
>       I've looked in all the FAQ's, including the mailing list one, and can't 
> find it anywhere. I know that the odds for a LL test are 1 in 60000, but I 
> can't find out what the odds on finding a factor are ? Presumably all the 
> non-prime numbers have a factor so the odds should be 1 in 59999, but most 
> of these factors must be higher than 2^64 (Yes/No) ? So what are the odds 
> on a factor that prime95 will find ?
>
> Alex Phillips
From a quick browse through the top 101-500 producer list (it's the one
I'm in :) it looks like the odds say you can expect 10-15 factors per P90
year spent on factoring.

-- 
Henrik Olsen,  Dawn Solutions I/S       URL=http://www.iaeste.dk/~henrik/
 Lord Mhoram: Thomas Covenant, you are the savior of The Land.
 Thomas Covenant: Bite me.              (Thomas Covenant saves The Land.)
     The Chronicles of Thomas Covenant, Unbeliever; Book-A-Minute version



------------------------------

Date: Sun, 23 Jan 2000 10:08:41 -0700
From: "Aaron Blosser" <[EMAIL PROTECTED]>
Subject: RE: Mersenne: Odds on finding a factor ?

> From a quick browse through the top 101-500 producer list (it's the one
> I'm in:) it looks like the odds say you can expect 10-15 factors per P90
> year spent on factoring.

Based on my own stats, I've got 13.959 P90 years spent factoring, with 177
factors found.  That's 12-13 per P90 year, so your estimate fits in my
case at least.  Currently, I'm #8 in factoring, but slipping fast. :-)

My quick checks on the top 10 factorers show that *usually*, the ratio is
the same, but for some of those top factorers, the ratios are much higher
(20-23 per P90 year).  I would guess that since those folks concentrate more
on factoring, they got some numbers that were not tested even up to 52-55
bits yet, so probably found more small factors than the rest of us will.

Aaron


------------------------------

Date: Sun, 23 Jan 2000 14:12:02 -0500
From: George Woltman <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Odds on finding a factor ?

Hi,

At 02:28 PM 1/23/00 -0000, Alex Phillips wrote:
>I've factored five numbers, all in the 11650000-11660000 range, as 
>allocated by Primenet, without finding a factor.
>       So my question is, What are the odds on finding a factor ?

Since these exponents are already factored to 2^52 and you will be 
factoring to 2^64, your chance of NOT finding a factor is about 52/64.
Thus, your chance of finding a factor is about 1 in 5.3.
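
That arithmetic is easy to sketch in Python (the function name is mine;
the a/b rule is the standard heuristic that, having found no factor below
2^a, the chance of none below 2^b is about a/b):

```python
def chance_of_new_factor(checked_bits, target_bits):
    """Heuristic probability of finding a factor when extending
    trial factoring from 2^checked_bits up to 2^target_bits:
    the chance of NO factor turning up is about checked_bits / target_bits."""
    return 1.0 - checked_bits / target_bits

p = chance_of_new_factor(52, 64)   # 0.1875, i.e. about 1 in 5.3
```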

Regards,
George



------------------------------

Date: Sun, 23 Jan 2000 15:48:20 -0500
From: Pierre Abbat <[EMAIL PROTECTED]>
Subject: Mersenne: Size of largest prime factor

If I pick a huge number n at random, how much smaller than n, on average, is
its largest prime factor?

phma

------------------------------

Date: Sun, 23 Jan 2000 17:00:06 -0500
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Size of largest prime factor

At 03:48 PM 1/23/00 -0500, Pierre Abbat wrote:
>If I pick a huge number n at random, how much smaller than n, on average, is
>its largest prime factor?

On the average, the largest prime factor of n is n^0.6065, and the second 
largest is n^0.2117.  Reference: Knuth, The Art of Computer Programming, 
vol. 2, section 4.5.4.
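
Figures like these are easy to sanity-check empirically. A rough Monte
Carlo sketch in Python (names are mine; small n and a small sample, so
expect noise around the asymptotic constants):

```python
import math
import random

def largest_prime_factor(n):
    """Largest prime factor of n by simple trial division (fine for small n)."""
    largest = 1
    d = 2
    while d * d <= n:
        while n % d == 0:
            largest = d
            n //= d
        d += 1
    return n if n > 1 else largest

random.seed(1)
samples = [random.randrange(2, 10**6) for _ in range(2000)]
# Average exponent a such that the largest prime factor is about n^a.
avg = sum(math.log(largest_prime_factor(n)) / math.log(n)
          for n in samples) / len(samples)
# avg lands in the neighborhood of 0.6 for a sample like this.
```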

+--------------------------------------------------------+
|                  Jud McCranie                          |
|                                                        |
| 137*2^197783+1 is prime!  (59,541 digits, 11/11/99)    |
| 137*2^224879+1 is prime!  (67,687 digits, 1/00)        |
+--------------------------------------------------------+


------------------------------

Date: Sun, 23 Jan 2000 16:36:11 -0600 (CST)
From: Conrad Curry <[EMAIL PROTECTED]>
Subject: Mersenne: NFSNET

On Sat, 22 Jan 2000, George Woltman wrote:

> Finding new factors of small Mersennes, so called Cunningham factors, is
> getting more difficult.  ECMNet and GIMPS have picked off most of the 
> "easy" factors.  I have two CPUs running ECM full-time.  The last 
> Cunningham factor I found was last summer.  I do occasionally find new
> factors of medium-sized (1200 to 100000) Mersenne numbers.


  If finding factors by ECM is becoming more difficult, we need to start
factoring these numbers by NFS.  Even if a factor is found by ECM, it
may not completely factor the number.  Unlike ECM, NFS is guaranteed
to factor the number.

  Of the Mersenne numbers, NFSNET is doing M629 and M641.  M641 has
a 148-digit composite cofactor.  Enough ECM has been done to make it
unlikely to have a factor of less than 45 digits.  NFS can't take
advantage of the known primitive factors; it must be run on the
whole 193-digit number.  It will be the second most difficult SNFS
factorization ever done, and is one of the Cunningham project's
most-wanted numbers.

  There are many "easier" 2^p+1 numbers that NFS can factor.  It
would be more productive to use NFS than ECM on these numbers.

  The NFSNET page is at http://orca.st.usm.edu/~cwcurry/nfs/nfs.html
We are currently working on M641 and 2,1542M (a factor of 2^1542+1).



------------------------------

Date: Sun, 23 Jan 2000 16:51:27 -0600
From: "Kyle Evans" <[EMAIL PROTECTED]>
Subject: RE: Mersenne: Size of largest prime factor

But (assuming n is composite) no prime factor of n can be greater than
n^0.5.  So how can n^0.6065 be the average?

(I hope I'm not showing my idiocy here!  :)

Kyle Evans (newbie on this list)

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Jud
McCranie
Sent: Sunday, January 23, 2000 4:00 PM
To: Pierre Abbat
Cc: [EMAIL PROTECTED]
Subject: Re: Mersenne: Size of largest prime factor


At 03:48 PM 1/23/00 -0500, Pierre Abbat wrote:
>If I pick a huge number n at random, how much smaller than n, on average,
is
>its largest prime factor?

On the average, the largest prime factor of n is n^0.6065, and the second
largest is n^0.2117.  Reference: Knuth, the Art of Computer Programming,
vol 2, section 4.5.4.

+--------------------------------------------------------+
|                  Jud McCranie                          |
|                                                        |
| 137*2^197783+1 is prime!  (59,541 digits, 11/11/99)    |
| 137*2^224879+1 is prime!  (67,687 digits, 1/00)        |
+--------------------------------------------------------+




------------------------------

Date: Sun, 23 Jan 2000 16:58:54 -0600
From: "Kyle Evans" <[EMAIL PROTECTED]>
Subject: RE: Mersenne: Size of largest prime factor

Never mind.  My idiocy is admitted.  :)

It's the LOWEST prime factor that can't exceed n^0.5.

Sorry for the trouble.

Kyle E.

-----Original Message-----
From: Kyle Evans [mailto:[EMAIL PROTECTED]]
Sent: Sunday, January 23, 2000 4:51 PM
To: Jud McCranie; Pierre Abbat
Cc: [EMAIL PROTECTED]
Subject: RE: Mersenne: Size of largest prime factor


But (assuming n is composite) no prime factor of n can be greater than
n^0.5.  So how can n^0.6065 be the average?

(I hope I'm not showing my idiocy here!  :)

Kyle Evans (newbie on this list)

-----Original Message-----
From: [EMAIL PROTECTED]
[mailto:[EMAIL PROTECTED]]On Behalf Of Jud
McCranie
Sent: Sunday, January 23, 2000 4:00 PM
To: Pierre Abbat
Cc: [EMAIL PROTECTED]
Subject: Re: Mersenne: Size of largest prime factor


At 03:48 PM 1/23/00 -0500, Pierre Abbat wrote:
>If I pick a huge number n at random, how much smaller than n, on average,
is
>its largest prime factor?

On the average, the largest prime factor of n is n^0.6065, and the second
largest is n^0.2117.  Reference: Knuth, the Art of Computer Programming,
vol 2, section 4.5.4.

+--------------------------------------------------------+
|                  Jud McCranie                          |
|                                                        |
| 137*2^197783+1 is prime!  (59,541 digits, 11/11/99)    |
| 137*2^224879+1 is prime!  (67,687 digits, 1/00)        |
+--------------------------------------------------------+




------------------------------

Date: Sun, 23 Jan 2000 18:14:39 -0500 (EST)
From: Lucas Wiman  <[EMAIL PROTECTED]>
Subject: Re: Mersenne: Size of largest prime factor

> But (assuming n is composite) no prime factor of n can be greater than
> n^0.5.  So how can n^0.6065 be the average?
> 
> (I hope I'm not showing my idiocy here!  :)

No, that's not correct.  If n is composite, then it *must have* a prime
factor <= n^0.5, but it can (though not always) have one larger.
E.g. 217 = 7*31: sqrt(217) ~= 14.73, and 7 < 14.73 but 31 > 14.73.

- -Lucas Wiman

------------------------------

Date: Sun, 23 Jan 2000 18:37:37 -0500
From: Jud McCranie <[EMAIL PROTECTED]>
Subject: RE: Mersenne: Size of largest prime factor

At 04:51 PM 1/23/00 -0600, Kyle Evans wrote:
>But (assuming n is composite) no prime factor of n can be greater than
>n^0.5.  So how can n^0.6065 be the average?

I'm not assuming that n is composite.  Some of them are prime, and in that 
case the largest prime factor is the number itself, and that brings up the 
average.


+--------------------------------------------------------+
|                  Jud McCranie                          |
|                                                        |
| 137*2^197783+1 is prime!  (59,541 digits, 11/11/99)    |
| 137*2^224879+1 is prime!  (67,687 digits, 1/00)        |
+--------------------------------------------------------+


------------------------------

Date: Sun, 23 Jan 2000 17:18:16 -0800
From: Stefan Struiker <[EMAIL PROTECTED]>
Subject: Mersenne: L-L Performance On PII/PIII XEON With 1M or 2M Cache

Team M:

Has there been a reported run of mprime on a PII/PIII Xeon with at least 1 MB
of cache?  If so, what was the improvement, if any?
                                    Regards,
                                            Stefan S.


------------------------------

Date: Mon, 24 Jan 2000 11:51:55 +0800
From: "Dave Mullen" <[EMAIL PROTECTED]>
Subject: Re: Re: Mersenne: Odds on finding a factor ?


If you're factoring numbers in the 11650000-11660000 (bit) range, the =
first factor could be anywhere in the root(11650000) - root(11660000) =
range i.e.  3413 - 3414 bits long !!

George's system prechecks to 2^52, and you are checking 2^52 - 2^64. =
There's still a long way from 2^64 to 2^3413 !!

I'd have thought your odds of finding a factor are a lot smaller than 12 =
/ 64 - probably closer to 12/3361 - and that's only if we pretend that =
the power series 2^n is a linear series he he he.

Ala UBASIC, I estimate your real odds might be 4096 /
3298942664324070148398699377093587963271453646320083409970227900099032340084550
62774912386254690657995387992797191286886934154060766755307240567049261017842856
34753285319263538813063962473878985845765797845382055337410440541344284339559168
80941876042039048147828263141419780748574268638921155547421336012026399104278046
64265542896070502632906030529322697818969236011674473356922970553100577222189744
01098206912973223458043228893199068964860334959883380891047401094866827202436849
06467404285944241544490932392466555180067560852971040278757170927077880835333805
65839845392145997956371325553841737475645737996524528874032899849701681509368301
42761015018525288690378571739745875603105853883771805379459256782614783580420623
57385982194936868642025682542127083166368763150103007436519265575280854513045193
16361252561366013410962597945954477571514685513193264633874076127007842257236863
98209046372801607655715856866505731790293078961776887506795295596160507661321192
45450554431681481621274487417255405424830736060991349632763754519609341441116629
8407300058885128192

Sorry to be the Grim Reaper, but I've spent months with UBASIC
eliminating factors in the 32,000,000 to 48,000,000 range - I'm only on
about 24% eliminated using multiples 2pk+1 where k is 1 to 2^16 - and
there's no doubt that the density of factors decreases as the multiplier
increases. Finding the first few % is easy - finding the last 1% might
take forever !!

However I am using "DaveNET" - One P133 Laptop only, shared with the
wife's chatting and E-Mail, and the kid's games !!

Dave



------------------------------

Date: Mon, 24 Jan 2000 00:28:21 -0500 (EST)
From: Lucas Wiman  <[EMAIL PROTECTED]>
Subject: Re: Re: Mersenne: Odds on finding a factor ?

> If you're factoring numbers in the 11650000-11660000 (bit) range, the first factor 
>could be anywhere in the root(11650000) - root(11660000) range i.e.  3413 - 3414 bits 
>long !!

No, in the x-y bit range (remember that n bit integers are about 2^n) the
first factor could be x/2 to y/2 bits long (powers of a power multiply).
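
A quick Python sketch of this point (my own toy check, not from any of
the mails): an n-bit number N satisfies 2^(n-1) <= N < 2^n, so its
square root - and hence its smallest possible "large" factor bound -
has about n/2 bits, not root(n) bits.

```python
# The square root of an n-bit number has about n/2 bits,
# so the smaller factor of an n-bit number fits in ~n/2 bits.
from math import isqrt

for n in (11, 64, 1000):
    N = 2**n - 1
    # bit length of N, and of floor(sqrt(N))
    print(n, isqrt(N).bit_length())   # -> 11 6, 64 32, 1000 500
```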

> I'd have thought your odds of finding a factor are a lot smaller than 12 / 64 - 
>probably closer to 12/3361 - and that's only if we pretend that the power series 2^n 
>is a linear series he he he.

see http://www.utm.edu/research/primes/glossary/MertensTheorem.html
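
The usual back-of-envelope estimate (my sketch of the standard GIMPS
folklore, not a quote from that page): the chance that a Mersenne number
has a factor between 2^k and 2^(k+1) is roughly 1/k, so the chance of a
factor anywhere in [2^a, 2^b] is about sum(1/k), i.e. close to ln(b/a).

```python
# Heuristic odds of a factor in the trial-factoring window [2^52, 2^64].
from math import log

a, b = 52, 64
estimate = sum(1.0 / k for k in range(a, b))
print(round(estimate, 3))    # ~0.209, i.e. about a 1-in-5 chance
print(round(log(b / a), 3))  # ln(64/52) ~ 0.208, the same thing
```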

> 
> Ala UBASIC, I estimate your real odds might be 4096 /  
>3298942664324070148398699377093587963271453646320083409970227900099032340084550
> [...]

Abbreviate!  Use scientific notation...

> Sorry to be the Grim Reaper, but I've spent months with UBASIC eliminating factors 
>in the 32,000,000 to 48,000,000 range - I'm only on about 24% eliminated using 
>multiples 2pk+1 where k is 1 to 2^16 - and there's no doubt that the density of 
>factors decreases as the multiplier increases. Finding the first few % is easy - 
>finding the last 1% might take forever !!

Why are you only setting k==1 mod 2^16?
(I'm probably missing something obvious)

I think under windows that dos windows only run when they are "up".
(I could be wrong, I've stopped using windows again)
You would probably get better results with Will Edgington's mersfacgmp
program, and DJGPP (a port of g++ to dos).

- -Lucas

------------------------------

Date: Mon, 24 Jan 2000 15:01:44 +0800
From: "Dave Mullen" <[EMAIL PROTECTED]>
Subject: Mersenne: Re : Odd's on finding a factor (part 2)


Sorry, I'm no mathematician, and new to the Mersenne field.

> No, in the x-y bit range (remember that n bit integers are about 2^n) the
> first factor could be x/2 to y/2 bits long (powers of a power multiply).

What I was trying to say in my disjointed way was ...

(Example) M11 = 2047 (11 bits long). Now 2047 has only 2 factors (23 x
89) and the square root of 2047 is approx 45. 45 is 6 bits long,
therefore the factor lower than the square root must have <= 6 bits,
and the factor higher than the square root must have >= 6 bits.

23 is 5 bits long, and 89 is 7 bits long.

Thus for the exponent 11650000 bits long, if it only has 2 factors,
then the first factor must be between 2 and 3413 bits long, and the
second factor must be between 3413 and 11649999 bits long. Note that
the bit lengths of the 2 factors added together must equal the bit
length of the Prime (or bit length of the Prime + 1) !!
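
The M11 example above is easy to check mechanically (a sketch of mine,
using Python's built-in bit_length):

```python
# Verify the M11 = 2047 = 23 * 89 example and the bit-length sum claim.
m11 = 2**11 - 1
assert 23 * 89 == m11
print((23).bit_length(), (89).bit_length())  # -> 5 7
# The factors' bit lengths sum to the number's bit length, or that + 1:
print((23).bit_length() + (89).bit_length(), m11.bit_length())  # -> 12 11
```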

> Abbreviate!  Use scientific notation...


Ooops.

> Why are you only setting k==1 mod 2^16?
> (I'm probably missing something obvious)

Yes, my failure to express myself properly - I meant I test each
exponent with all "k" multipliers in the range 1 to 2^16.

> I think under windows that dos windows only run when they are "up".

Under Win 98, they chug along quite happily in the background, unless
you're running some other DOS box in Full Screen mode (I think).

> You would probably get better results with Will Edgington's mersfacgmp
> program, and DJGPP (a port of g++ to dos).


I'll check it out. I'm using UBASIC because for testing factors of
large Mersennes, I don't need to actually represent the full Mersenne
number. If you're familiar with UBASIC try ...

Result = MODPOW(2,MersenneExponent,TrialFactor)

where TrialFactor is the MersenneExponent * 2 * (some k in range 1 to
2^16) + 1.

If Result = 1 then TrialFactor divides the Mersenne number. As UBASIC
can handle around 2600 decimal digits, in theory (and with a lot of
time), I could check all factors up to 2600 decimal digits for any
given exponent. It's a damn sight faster than filling 16MB+ of memory
with 1's and then trial dividing.
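
The same trial-factoring loop translates directly to Python, where the
three-argument pow() plays the role of UBASIC's MODPOW (my sketch of the
method described above, not Dave's actual code):

```python
# Trial factoring M_p = 2^p - 1: any factor q has the form q = 2*k*p + 1,
# and q divides M_p exactly when 2^p mod q == 1 -- no need to ever
# build the full 2^p - 1.
def small_factors(p, kmax=2**16):
    found = []
    for k in range(1, kmax + 1):
        q = 2 * k * p + 1
        if pow(2, p, q) == 1:   # MODPOW(2, p, q) in UBASIC
            found.append(q)
    return found

print(small_factors(11, 8))  # -> [23, 89]  (k=1 gives 23, k=4 gives 89)
```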



------------------------------

End of Mersenne Digest V1 #683
******************************
