As far as I know, overclocking is not done to the extent that it was in the past, when with proper cooling some processors would run at a significant percentage increase in speed. New processors already operate in the microwave range when you look at their frequencies. You can imagine the weird things that may happen in a chip when you go from 3.2 GHz to 4.2 GHz: strange effects from the inductance and capacitance of the traces, a signal on one trace jumping to another as an RF signal. It just doesn't seem to me to be worth the effort to push a chip beyond its rated speed anymore. Also, the bottleneck is not so much the processor; memory, the chipset and the graphics card have a large impact on how fast a game plays. Overclocking is now more work for less benefit.

Brian J. Beesley wrote:

On Friday 16 January 2004 06:10, Max wrote:


It would also be interesting to learn how often the first run is bad, and
how often the second is.



Yes - I don't think this information is readily available, though sometimes you can infer the order of completion from the program version number.


To do the job properly, either the "bad" database would need an extra field (date of submission) or a complete set of "cleared.txt" files would be required - and this would miss any results submitted manually.
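
A minimal sketch of what that tally could look like if submission dates existed (the record layout below is invented for illustration, not the actual server format):

from collections import defaultdict

# Hypothetical records: (exponent, submission date, result was bad).
# The real "bad" database has no submission date field, which is
# exactly the gap noted above.
results = [
    (10000019, "2000-03-01", True),
    (10000019, "2004-01-10", False),
]

attempts = defaultdict(list)
for exponent, date, bad in results:
    attempts[exponent].append((date, bad))

first_bad = later_bad = 0
for runs in attempts.values():
    runs.sort()                      # ISO dates sort chronologically
    for i, (_, bad) in enumerate(runs):
        if bad:
            if i == 0:
                first_bad += 1
            else:
                later_bad += 1

print("bad first runs:", first_bad, "bad later runs:", later_bad)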


It seems to me that the first run should be bad more often than the second. Is
that true? My reasoning is that the first run is usually done on modern
(fast/overclocked/unstable/etc.) hardware while the second one is done on
older/slower but more stable/trusted hardware.



Interesting theory - but surely the error rate would be expected to be proportional to the run length, which would tend to make fast hardware appear relatively more reliable. Conversely, the smaller / lower-power components required to achieve high speed would be more subject to quantum tunnelling errors; for those who think in terms of cosmic rays, this means a less energetic particle hit will be enough to flip the state of a bit.
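
To make the run-length point concrete, here is a rough sketch in Python (the error rate and run times are made up for illustration, assuming errors arrive at a fixed rate per hour of wall-clock time):

import math

# Assume errors arrive at a fixed rate per hour of wall-clock time,
# so the chance that a run goes bad grows with how long it takes.
def p_bad_run(errors_per_hour, run_hours):
    # P(at least one error) = 1 - exp(-rate * time) for a Poisson process.
    return 1.0 - math.exp(-errors_per_hour * run_hours)

rate = 1e-5                   # hypothetical error rate per hour
print(p_bad_run(rate, 200))   # fast machine, short run  -> ~0.002
print(p_bad_run(rate, 2000))  # slow machine, long run   -> ~0.020

The same exponent takes the slow machine ten times as long, so even with identical per-hour reliability its result is about ten times as likely to be bad.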


In any case, the exponents around 10,000,000 which are being double-checked now were originally tested on "leading edge" hardware about 4 years ago, when overclocking was by no means unknown but was often done without the sort of sophisticated cooling which is readily available these days.

Regards
Brian Beesley
_________________________________________________________________________
Unsubscribe & list info -- http://www.ndatech.com/mersenne/signup.htm
Mersenne Prime FAQ      -- http://www.tasam.com/~lrwiman/FAQ-mers