https://gcc.gnu.org/bugzilla/show_bug.cgi?id=95953

            Bug ID: 95953
           Summary: UTF Convert for UTF_16 to UTF_8 fails for values in
                    U+10000 to U+10FFFF.
           Product: gcc
           Version: 7.3.1
            Status: UNCONFIRMED
          Severity: normal
          Priority: P3
         Component: ada
          Assignee: unassigned at gcc dot gnu.org
          Reporter: trashgod at gmail dot com
  Target Milestone: ---

Created attachment 48798
  --> https://gcc.gnu.org/bugzilla/attachment.cgi?id=48798&action=edit
complete example that reproduces the problem and tests the fix

The library function Ada.Strings.UTF_Encoding.Conversions.Convert for UTF_16 to
UTF_8 fails for values in the range U+10000 to U+10FFFF. Byte three of the
four-byte encoding is malformed; yyyyyyyy only needs to be shifted _two_ places
left, not four:

Result (Len + 3) :=
  Character'Val
    (2#10_000000# or Shift_Left (yyyyyyyy and 2#1111#, 2) -- originally 4
                  or Shift_Right (xxxxxxxx, 6));

A complete example, utftest.adb, is attached and discussed here:

https://stackoverflow.com/a/62605993/230513

The output shows the code point, the expected binary representation of its
UTF-8 encoding, the conversion to UTF-16, the incorrect UTF-8 conversion
produced by the library, and the correct UTF-8 conversion; the arithmetic
behind byte three is worked through after the listing.

$ gcc -v
…gcc version 7.3.1 20180524 (for GNAT Community 2018 20180523) (GCC)
$ gnatmake utftest.adb && ./utftest
gcc -c utftest.adb
gnatbind -x utftest.ali
gnatlink utftest.ali
Codepoint: 16#1D11E#
 UTF-8: 4: 2#11110000# 2#10011101# 2#10000100# 2#10011110#
UTF-16: 2: 2#1101100000110100# 2#1101110100011110#
 UTF-8: 4: 2#11110000# 2#10011101# 2#10010000# 2#10011110#
 UTF-8: 4: 2#11110000# 2#10011101# 2#10000100# 2#10011110#
OK
