I found the problem - I had to compile my program with the -fshort-wchar flag.
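
A minimal way to check this, assuming the XPCOM build expects 2-byte UTF-16 code units and that nsString.h is the string header the snippets below already use:

#include <stdio.h>
#include "nsString.h"   // assumption: whichever string header your build uses

int main()
{
    // With -fshort-wchar both sizes should be 2. Without it, gcc on Linux
    // emits 4-byte units for wide literals, while XPCOM still reads the
    // buffer as 2-byte UTF-16 units - which produces the symptoms below.
    printf("sizeof(wchar_t)              = %u\n", (unsigned)sizeof(wchar_t));
    printf("sizeof(nsAString::char_type) = %u\n", (unsigned)sizeof(nsAString::char_type));
    return 0;
}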

Alexey Kakunin wrote:
After playing with strings a little bit more, I found more information - the problem seems to be in the UTF-16 string created by NS_LITERAL_STRING. Here is the code:

nsString  test = NS_LITERAL_STRING("<xml>");
const nsAString::char_type* pointer = test.get();
nsAString::char_type firstChar = *pointer;
pointer++;
nsAString::char_type secondChar = *pointer;

The thing is that firstChar in this case has code 60 (it is '<') - that is correct, but the second char is 0 instead of 'x'!

OK, maybe now somebody has an idea why?
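
One plausible explanation, assuming XPCOM was built with -fshort-wchar but the application was not: the wide literal behind NS_LITERAL_STRING is stored as 4-byte units, while the buffer is read as 2-byte units, so every other unit comes out as zero. A self-contained sketch of that mismatch, with no XPCOM involved:

#include <stdio.h>

int main()
{
    // Without -fshort-wchar, gcc on Linux stores this literal as 4-byte units.
    const wchar_t wide[] = L"<xml>";

    // Reading the same memory as 2-byte units (what a 2-byte char_type does)
    // gives 60 ('<') followed by 0 - the upper half of the 4-byte '<'.
    const unsigned short* units = (const unsigned short*)wide;
    printf("first unit:  %u\n", (unsigned)units[0]);  // 60 on little-endian x86
    printf("second unit: %u\n", (unsigned)units[1]);  // 0; with -fshort-wchar it would be 120 ('x')
    return 0;
}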


Alexey Kakunin wrote:

Hello all!

I have: Linux x86 with gcc 3.4.4
And xpcom-standalone compiled from the latest CVS sources.

Then I'm trying to do a simple conversion from UTF-16 to UTF-8, like this:

nsString  test = NS_LITERAL_STRING("<xml>");
printf("unicode string size is: %d\n", test.Length());

nsCString strUTF8 = NS_ConvertUTF16toUTF8(test);
printf("utf8 string size is: %d\n", strUTF8.Length());
// it is really funny! but it seems that under Linux
// 2 bytes are used per character in the CString!
printf("utf8 string is: %s\n", (const char*)strUTF8.get());


I got the string "<\0x\0m\0l\0>\0" - so it seems I have not a single-byte string in the CString, but a double-byte one...
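
printf("%s") stops at the first NUL byte, so one way to see the whole converted buffer is to dump it byte by byte. A small sketch with a hypothetical helper (DumpBytes is not an XPCOM API, and the header include is an assumption):

#include <stdio.h>
#include "nsString.h"   // assumption: same string header as the snippet above

// Hypothetical debugging helper: print every byte of the converted string,
// so the embedded NUL bytes in "<\0x\0m\0l\0>\0" become visible instead of
// truncating the output at the first 0.
static void DumpBytes(const nsCString& str)
{
    const char* bytes = str.get();
    for (PRUint32 i = 0; i < str.Length(); ++i)
        printf("%02x ", (unsigned char)bytes[i]);
    printf("\n");
}

// On the mismatched build this prints: 3c 00 78 00 6d 00 6c 00 3e 00
// After rebuilding with -fshort-wchar: 3c 78 6d 6c 3e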

The same example works fine under Windows...

Do you have any ideas about why this happens?

With best regards,
Alexey Kakunin
