On Tuesday 08 June 2010 02:43:22 pm John Dlugosz wrote:
> So it's only history that some glyphs used as digits are separate and
> others (for Computer Science work anyway) are not. In practice, we don't
> need unique assignments, in general. There are characters that are used
> in numeric literals and they are a subset of those used for words in
> general.
I see this as saying that we don't need HTML, XML, or any other content-describing format, and that we should stick to formats that merely describe appearance (such as PDF or PostScript), since the meaning can be inferred from how the text looks. Assuming you don't actually believe that, why should it be any different at the character level? Finally, there are in fact rendering differences between O and 0 (or else nobody would understand your history lacking the words "letter", "number", and "symbol"), and it is plausible, if the tonal system were encoded, that a font might give visual hints to distinguish a decimal 2 from a tonal 2.
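For what it's worth, Unicode already makes exactly this kind of semantic distinction at the character level: O and 0 are separate code points with different general categories, regardless of how similarly a font renders them. A minimal Python sketch using the standard unicodedata module:

```python
import unicodedata

# LATIN CAPITAL LETTER O and DIGIT ZERO are distinct code points
# with distinct general categories (Lu = uppercase letter,
# Nd = decimal digit), even when they look nearly identical.
for ch in "O0":
    print(f"U+{ord(ch):04X} {unicodedata.name(ch)} "
          f"category={unicodedata.category(ch)}")
```

The same mechanism is what would let an encoded tonal digit carry its own identity, so that fonts and numeric parsers could treat it differently from the decimal digit it resembles.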

