On Jun 4, 2010, at 5:45 PM, Hans Aberg wrote:
> Hexadecimal representation is only used to give a compact representation of
> binary numbers in connection with computers. In view of modern fast computers,
> one only needs to write out numbers when interfacing with humans. Then one
> can easily make the computer write or read what humans are used to. So there
> is no particular need to switch to another base than ten if that is what
> humans prefer. Base 16 is easier when one for some reason needs to think
> about the binary representation.
And that need is much less than it was, say, 40 years ago. We don't normally debug from simple dumps anymore.

--
John W Kennedy
If Bill Gates believes in "intelligent design", why can't he apply it to Windows?
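
[Editor's note: neither message contains code, but the quoted claim -- that base 16 is just a compact transcription of the binary representation, and that converting to base ten for humans is trivial for the machine -- can be illustrated in a few lines. The Python sketch below is purely illustrative and is not from the original thread; the particular value is arbitrary.]

    # Each hexadecimal digit corresponds to exactly four bits, which is why
    # hex reads as a compact transcription of a binary number, while rendering
    # the same value in base ten for human readers is a one-liner.

    value = 0b1101_0111_0010        # an arbitrary bit pattern

    print(format(value, "b"))       # 110101110010 -- the raw binary
    print(format(value, "x"))       # d72          -- one hex digit per 4-bit group
    print(value)                    # 3442         -- base ten for human consumption

    # Grouping the bits by four makes the digit-for-digit correspondence visible.
    bits = format(value, "b").zfill((value.bit_length() + 3) // 4 * 4)
    for group in (bits[i:i + 4] for i in range(0, len(bits), 4)):
        print(group, "->", format(int(group, 2), "x"))
        # prints: 1101 -> d, 0111 -> 7, 0010 -> 2
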

