It was the 90s, when 16 bits seemed enough. Wish we could go back. Even
in 1995 this was obviously going to fail, but the die had been cast
years earlier in Windows and Java APIs and language/implementation designs.
/be
Claude Pache wrote:
(So, taking your example, the 💩 character is internally represented as a
sequence of two 16-bit units, not “characters”. And, very confusingly, the
String methods that contain “char” in their name have nothing to do with
“characters”.)
—Claude
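
For anyone following along, here is a quick sketch of what Claude describes, assuming an ES2015+ engine (for codePointAt and code-point-aware iteration):

  var poo = "💩";                        // U+1F4A9, stored as a surrogate pair
  poo.length;                            // 2  — counts 16-bit code units, not characters
  poo.charCodeAt(0).toString(16);        // "d83d" — high surrogate
  poo.charCodeAt(1).toString(16);        // "dca9" — low surrogate
  poo.charAt(0);                         // a lone surrogate, not a usable character
  poo.codePointAt(0).toString(16);       // "1f4a9" — ES2015 can see the full code point
  Array.from(poo).length;                // 1  — iteration walks code points, not units

The "char"-named methods all operate on code units, which is exactly the confusion being pointed out.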