On Sun, Apr 14, 2013 at 2:56 AM, Random832 <random...@fastmail.us> wrote:
> Okay, but why not work with a unicode code point as an int?
>


-1 from me.
It is utter madness to waste a full 32-bit int (and x86_64 registers
are 64 bits wide) on every single code point, when UTF-8 stores most
of them in one or two bytes. A UTF-8 sequence can run to 4 bytes per
code point (6 in the original, pre-RFC 3629 design), but believe me
you don't need to decode to fixed-width ints at all, as long as there
are mblen(3) / mbrlen(3)...
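
If it helps, here is a minimal, untested sketch of what I mean:
counting the characters of a string in place with mbrlen(3), no wide
ints anywhere. It assumes a UTF-8 locale picked up from the
environment and a UTF-8-encoded source file.

	/* count multibyte characters with mbrlen(3) */
	#include <locale.h>
	#include <stdio.h>
	#include <stdlib.h>	/* MB_CUR_MAX */
	#include <string.h>
	#include <wchar.h>	/* mbrlen, mbstate_t */

	int
	main(void)
	{
		const char *s = "héllo wörld";
		mbstate_t st;
		size_t len, n = 0;

		setlocale(LC_CTYPE, "");	/* honor the environment's locale */
		memset(&st, 0, sizeof st);

		while (*s) {
			len = mbrlen(s, MB_CUR_MAX, &st);
			if (len == (size_t)-1 || len == (size_t)-2)
				return 1;	/* invalid or incomplete sequence */
			s += len;	/* step over one whole character */
			n++;
		}
		printf("%zu characters\n", n);
		return 0;
	}

The string stays exactly as many bytes as it needs to be, and you
still get per-character iteration for free.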

cheers!
mar77i
