On Friday 06 July 2001 10:13 am, Dan Sugalski wrote:
> I should point out that the internal representation of large numbers isn't
> going to be huge strings of ASCII characters--it'll probably be an array
> of 15-bit integers. (As Hong pointed out a while ago, doing that makes
> handling multiplication reasonably simple. We might go to arrays of 31-bit
> integers on 64-bit platforms.) Though I might be misreading you here. (I
> probably am)
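For context, the scheme Dan describes -- base-2^15 "limbs" chosen so that a 15-bit x 15-bit partial product plus carries always fits in an unsigned 32-bit temporary -- might look roughly like this in C. This is my own sketch (names, layout, and signatures are invented for illustration, not Parrot's actual code):

```c
#include <stdint.h>

#define LIMB_BITS 15
#define LIMB_MASK 0x7FFF  /* (1 << LIMB_BITS) - 1 */

/* Schoolbook multiplication of two magnitudes stored as
   little-endian arrays of 15-bit limbs (one limb per uint16_t).
   result must have room for an + bn limbs. */
void bn_mul(const uint16_t *a, int an,
            const uint16_t *b, int bn,
            uint16_t *result)
{
    for (int i = 0; i < an + bn; i++)
        result[i] = 0;

    for (int i = 0; i < an; i++) {
        uint32_t carry = 0;
        for (int j = 0; j < bn; j++) {
            /* 15-bit * 15-bit is at most 30 bits; adding the
               existing limb and the carry still fits in 32 bits,
               which is the whole point of stopping at 15 bits. */
            uint32_t t = (uint32_t)a[i] * b[j]
                         + result[i + j] + carry;
            result[i + j] = (uint16_t)(t & LIMB_MASK);
            carry = t >> LIMB_BITS;
        }
        result[i + bn] = (uint16_t)carry;
    }
}
```

With 31-bit limbs on a 64-bit platform the same loop works unchanged, with `uint64_t` as the temporary; the invariant is simply that two limbs' product plus two limbs' worth of carry fits in the double-width type.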
Actually, you *shouldn't* have had to point that out. No, you weren't
misreading me, and yes, Virginia, I am a fucking idiot. I can't even think of
what I may have been thinking of. We have talked about this before. I write
the damn summaries, for crying out loud! Arggh!
--
Bryan C. Warnock
[EMAIL PROTECTED]