On 7/17/2011 12:19 PM, Doug Ewell wrote:
> Asmus wrote:
>
>> The reason is, of course, that these codes would *reinterpret* existing
>> characters. You could argue that Variation Selectors do the same, but
>> they are carefully constructed so that they can be safely ignored.
>
> Variation selectors don't change the interpretation of characters, only
> their visual appearance.

The process of display is part of the more general concept of "interpretation" as this term is used in the Unicode Standard.

A./

PS: Variation selectors don't necessarily even "change" the visual appearance of a character. If the glyph shape for the given character in the selected font already matches, or falls within, the glyphic subspace indicated by the variation sequence, then you would observe no change at all. (Ditto for display processes that don't support variation selectors, but that's a whole different kettle of fish.)
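
To make the "safely ignored" property concrete, here is a minimal sketch in Python (the heart character and the helper function are mine, chosen purely for illustration). A process that does not understand variation selectors can simply skip or strip them and is left with exactly the same base characters; only the requested glyph style is lost, never the identity of the text:

    # U+2764 HEAVY BLACK HEART combined with the standardized variation
    # selectors VS15 (U+FE0E, text presentation) and VS16 (U+FE0F, emoji
    # presentation).
    base = "\u2764"
    text_style = base + "\uFE0E"
    emoji_style = base + "\uFE0F"

    def strip_variation_selectors(s):
        # Drop VS1..VS16 (U+FE00..U+FE0F) and the ideographic variation
        # selectors VS17..VS256 (U+E0100..U+E01EF).
        return "".join(
            ch for ch in s
            if not ("\uFE00" <= ch <= "\uFE0F"
                    or "\U000E0100" <= ch <= "\U000E01EF")
        )

    # Ignoring the selectors leaves the base character untouched.
    assert strip_variation_selectors(text_style) == base
    assert strip_variation_selectors(emoji_style) == base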
