On Sun, Oct 24, 1999 at 10:03:45PM +0100, Brian J. Beesley wrote:
>Have you any idea of the amount of CPU time needed to convert a 10 
>million bit binary number to a 3 million digit decimal number?

Yes, but you don't need the entire number, do you? Collecting the
low 64 bits doesn't take _that_ much time. Converting the entire
number, though, would just be weird: try looking at a 3-million-digit
number and getting anything useful out of it in the split second each
iteration takes :-)
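
To illustrate (a rough sketch in Python, which has arbitrary-precision
integers built in; this is _not_ the actual Prime95/mprime code, and the
sizes and names below are made up for the demo): masking off the low 64
bits is a single AND, while the full decimal expansion is what costs.

import random
import sys
import time

# Python 3.11+ caps int->str conversion by default; lift the cap so
# the full decimal conversion below is allowed to run.
if hasattr(sys, "set_int_max_str_digits"):
    sys.set_int_max_str_digits(4_000_000)

n_bits = 1_000_000              # scale to 10_000_000 to match the thread
x = random.getrandbits(n_bits)  # stand-in for the iteration's residue

t0 = time.perf_counter()
low64 = x & 0xFFFFFFFFFFFFFFFF  # low 64 bits: one mask, essentially free
t1 = time.perf_counter()
print(f"low 64 bits: {low64:016X}  ({t1 - t0:.6f} s)")

t0 = time.perf_counter()
digits = str(x)                 # the full decimal expansion
t1 = time.perf_counter()
print(f"{len(digits)} decimal digits  ({t1 - t0:.3f} s)")

The gap only grows with size: classical binary-to-decimal conversion is
roughly quadratic in the bit length, while the 64-bit mask stays constant
no matter how big the number gets.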

/* Steinar */
-- 
Homepage: http://members.xoom.com/sneeze/