> From a practical standpoint, I think it is more likely that the base will
> change rather than the hex characters. After all, digits have been constant
> for a long time, but the base has changed. Initially it was binary, then it
> was octal, and now hex arithmetic is common.

No: first it was binary, then it was binary, and now it's binary. Different
human-readable formats have been (and continue to be) used to represent it.
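
A quick Python sketch, just to make that point concrete (any language with
numeric literals would do): the stored value is one and the same bit
pattern; only the textual spelling changes.

    # Same underlying binary value, four human-readable spellings of it.
    n = 0b10100101          # binary literal
    assert n == 0o245       # octal
    assert n == 0xA5        # hexadecimal
    assert n == 165         # decimal
    print(bin(n), oct(n), hex(n), n)   # -> 0b10100101 0o245 0xa5 165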

> It seems more likely to me that we might switch to another base (32? 64?)
> as platforms expand, before we started adding redundant characters to hex
> arithmetic.

What human-readability advantages (the only reason we use hex) would base 32
or base 64 representations have over hex? Their digits don't line up with the
word sizes of most systems: the reason for using hex rather than octal is
that two hex digits exactly represent the range of an octet (the most common
size of bytes these days) and, by extension, of any word composed of an
integral number of octets. The next base with that property is base 256,
which would require us to ransack a few different alphabets and then invent
some new symbols in order to represent it.
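
To make the alignment argument concrete, here is a minimal Python sketch
(the only assumption is the standard math module): a base lines up with
octets exactly when its bits-per-digit divides 8 evenly, which holds for
hex and base 256 but not for octal, base 32, or base 64.

    import math

    # An octet holds 8 bits. A base aligns with octets only when its
    # bits-per-digit divides 8 evenly, i.e. 8 % log2(base) == 0.
    for base in (8, 16, 32, 64, 256):
        bits = int(math.log2(base))
        aligned = "yes" if 8 % bits == 0 else "no"
        print(f"base {base:>3}: {bits} bits/digit, aligns with octets: {aligned}")

    # Two hex digits span exactly one octet: 16**2 == 2**8 == 256.
    assert 16 ** 2 == 2 ** 8
    # Base 256 would be the next single-digit-per-octet base.
    assert 256 ** 1 == 2 ** 8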

