Max wrote:
> On Thursday 15 January 2004 12:22, Brian J. Beesley wrote:
>> On Thursday 15 January 2004 01:00, Max wrote:
>>> Are any statistics on double-check mismatches available?
>>> How often does this happen?
>> ~2% of all runs are bad.
>
> It would also be interesting to learn how often the first run is bad, and how often the second.
> It seems to me that the first run should be bad more often than the second. Is that true?
> My reasoning is that the first run is usually done on modern (fast/overclocked/unstable/etc.) hardware, while the second is done on old/slow but more stable/trusted hardware.
> Please correct me if I'm wrong.
> Thanks, Max

A reason for that to be reversed would be random "cosmic ray" errors. A faster computer spends less wall-clock time on each exponent, so there is less opportunity for such an error to strike during any one test. Say such an error occurs about once a year per computer (I know this is too often; it's just an example): a slow computer that finishes one exponent per year would average about one error per exponent, while one that completes ten in a year would average one error per ten exponents.
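To put rough numbers on that argument, here is a minimal sketch in Python. It treats errors as a Poisson process, so a test's chance of being hit depends on how long it runs; the once-a-year error rate is the illustrative assumption from the paragraph above, not a measured figure.

import math

# Illustrative assumption from the example above: a machine suffers
# a random "cosmic ray" error about once per year on average.
ERRORS_PER_YEAR = 1.0

def p_bad_run(tests_per_year):
    """Probability that a single test is hit by at least one error,
    modelling errors as a Poisson process at ERRORS_PER_YEAR."""
    expected_errors_per_test = ERRORS_PER_YEAR / tests_per_year
    return 1.0 - math.exp(-expected_errors_per_test)

for rate in (1, 10):  # a slow machine vs. one ten times as fast
    print(f"{rate:2d} tests/year -> P(bad run) = {p_bad_run(rate):.1%}")

This prints about 63% for the one-test-per-year machine and about 9.5% for the ten-tests-per-year machine, so under this assumption the slower machine's results are the more error-prone, pushing in the opposite direction from the hardware effect Max describes.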
_________________________________________________________________________
Unsubscribe & list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ -- http://www.tasam.com/~lrwiman/FAQ-mers