Tue, 16 May 2000 12:26:12 +0200 (MET DST), Frank Atanassow <[EMAIL PROTECTED]> writes:
> Of course, you can always come up with specialized schemes involving stateful
> encodings and/or "block-swapping" (using the Unicode private-use areas, for
> example), but then, that subverts the purpose of Unicode.

Tue, 16 May 2000 10:44:28 +0200, George Russell <[EMAIL PROTECTED]> writes:
> > As for the language standard: I hope that Char will be allowed or
> > required to have >=30 bits instead of current 16; but never more than
> > Int, to be able to use ord and chr safely.
>
> Er, does it have to? The Java Virtual Machine implements Unicode with
> 16 bits.

George Russell writes:
> Marcin 'Qrczak' Kowalczyk wrote:
> > As for the language standard: I hope that Char will be allowed or
> > required to have >=30 bits instead of current 16; but never more than
> > Int, to be able to use ord and chr safely.
> Er, does it have to? The Java Virtual Machine implements Unicode with
> 16 bits.

> > OTOH, it wouldn't be hard to change GHC's Char datatype to be a
> > full 32-bit integral data type.
>
> Could we do it please?
>
> It will not break anything if done slowly. I imagine that
> {read,write}CharOffAddr and _ccall_ will still use only 8 bits of
> Char. But after Char is wide, lib[...]
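The concern above — that byte-oriented primitives like readCharOffAddr and _ccall_ would keep using only 8 bits of a widened Char — can be sketched as follows. The helper toByte is purely illustrative (not a GHC primitive); it models what squeezing a wide Char through an 8-bit interface would do, namely keep only the low byte:

```haskell
import Data.Char (chr, ord)
import Data.Word (Word8)

-- Illustrative model of an 8-bit marshalling path: a wide Char
-- passed through a byte-oriented interface keeps only its low 8 bits.
toByte :: Char -> Word8
toByte = fromIntegral . ord

main :: IO ()
main = do
  print (toByte 'A')          -- 65: Latin-1 characters survive intact
  print (toByte (chr 0x105))  -- 5: U+0105 loses its high bits (0x105 -> 0x05)
```

So code that only ever moves Latin-1 text through such interfaces keeps working, which is why the change could be made gradually.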
Marcin 'Qrczak' Kowalczyk wrote:
> As for the language standard: I hope that Char will be allowed or
> required to have >=30 bits instead of current 16; but never more than
> Int, to be able to use ord and chr safely.
Er does it have to? The Java Virtual Machine implements Unicode with
16 bits.
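For what it's worth, the route Marcin asks for is roughly where GHC later ended up: Char covers the full Unicode code-point range (maxBound :: Char is '\x10FFFF', which needs 21 bits), and since that is well below maxBound :: Int, the ord/chr round trip is safe for every character, including those a 16-bit Char cannot represent:

```haskell
import Data.Char (chr, ord)

main :: IO ()
main = do
  -- U+1D11E (MUSICAL SYMBOL G CLEF) does not fit in 16 bits...
  let c = chr 0x1D11E
  print (ord c)             -- 119070, i.e. 0x1D11E
  print (ord c > 0xFFFF)    -- True: outside the 16-bit range
  -- ...but because Char never exceeds Int, it round-trips safely.
  print (chr (ord c) == c)  -- True
```

A 16-bit Char, JVM-style, would instead have to represent this character as a surrogate pair, which is exactly the kind of stateful encoding scheme Frank argues subverts the purpose of Unicode.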