* John Cowan
| 
| C1 says "A process shall interpret Unicode code values as 16-bit
| quantities."

This I find mightily confusing.  Why say something like this when
there are (well, will be) characters whose code values cannot be
represented in a single 16-bit quantity?
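For what it's worth, here is a quick sketch in Python of the problem
(using U+10000, the first supplementary-plane code point, as the
example):

```python
# U+10000 is a single character with a single code value, but that
# value does not fit in 16 bits: its UTF-16 encoding needs two 16-bit
# code units (a surrogate pair).
ch = "\U00010000"
utf16 = ch.encode("utf-16-be")
units = [int.from_bytes(utf16[i:i + 2], "big")
         for i in range(0, len(utf16), 2)]
print(len(ch))                    # one character
print([hex(u) for u in units])    # two 16-bit code units: D800, DC00
```

So "code value" and "16-bit quantity" cannot be synonyms once we get
past U+FFFF; only the code *units* of UTF-16 are 16-bit.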

| "Code unit" is defined in definition D5 as a synonym for "code
| value".  If this needs updating,

Unless I've really misunderstood something, it does need updating.

--Lars M.
