On Friday 04 June 2010 04:45:57 pm Hans Aberg wrote:
> Hexadecimal representation is only used to give a compact
> representation of binary numbers in connection with computers. In view
> of modern fast computers, one only needs to write out numbers when
> interfacing with humans. Then one can easily make the computer write
> or read what humans are used to. So there is no particular need to
> switch to a base other than ten if that is what humans prefer. Base 16
> is easier when one for some reason needs to think about the binary
> representation.

Base 16 is superior in several ways, the most obvious being easier 
division (both visually and numerically). Why assume all humans prefer the same 
thing? That is like assuming everyone knows the same language, uses the same 
characters, and so on.
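To make the divisibility point concrete, here is a small illustrative sketch (my own example, not from the original posts): because 16 is a power of 2, the last hexadecimal digit of a number alone settles divisibility by 2, 4, 8, and 16, whereas the last decimal digit only settles divisibility by 2 and 5.

```python
def divisible_by_last_hex_digit(n: int, d: int) -> bool:
    """Check divisibility by d (a divisor of 16) using only n's last hex digit.

    Valid because n = 16*q + r with r = n & 0xF, and 16 % d == 0 implies
    n % d == r % d.
    """
    assert 16 % d == 0, "shortcut only works for divisors of 16"
    return (n & 0xF) % d == 0

# The shortcut agrees with ordinary divisibility for any n:
for n in (0x38, 0x2C, 0x51, 1000, 12345):
    for d in (2, 4, 8, 16):
        assert divisible_by_last_hex_digit(n, d) == (n % d == 0)
```

The analogous base-10 trick works only for 2 and 5, since those are the prime factors of 10; that asymmetry is one way base 16 makes the binary structure of a number visible.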

> But if humans in the future were to use base 16 a lot, it might be
> convenient to have special symbols for it. The typical outcome would be
> glyphs that are some alteration of A-F.

While it is natural for glyphs to evolve, artificial character sets are not 
unheard of. For example, the Korean alphabet was designed such that each 
character, representing a syllable, is composed of sub-characters representing 
the individual sounds in that syllable. Despite its artificial origin, numerous 
people use it in their daily lives.
