> First convert the char(1) in your charset into Unicode (cause
> everything in FW work with unicode), then use simple conversion:
>
>     char c = 'a';
>     byte b = (byte)c;
You can do char c = 'a' because 'a' is a Char. What you can't do is

    char c = (byte)'\xA0'; // a byte with the value 0xA0, not the letter at 0xA0 in your charset

you need some decoding.

If I do

    byte b = (byte)'\xA0';
    byte[] bytearray = { b };
    char[] chararray = Encoding.UTF8.GetChars(bytearray);
    int charCount = Encoding.UTF8.GetCharCount(bytearray);

then charCount is 0. // decoding fails

If I do

    byte b = (byte)'\xA0';
    byte zero = 0;
    byte[] bytearray = { b, zero };
    char[] chararray = Encoding.UTF8.GetChars(bytearray);
    int charCount = Encoding.UTF8.GetCharCount(bytearray);

then charCount is 1. // decoding is successful

And if

    byte b = (byte)'\x10'; // below 128, so in the ASCII range
    byte[] bytearray = { b };
    char[] chararray = Encoding.UTF8.GetChars(bytearray);
    int charCount = Encoding.UTF8.GetCharCount(bytearray);

then charCount is 1. // decoding is successful

Now you see the problem.

Regards,
Nobuya

_______________________________________________
Firebird-net-provider mailing list
Firebird-net-provider@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/firebird-net-provider
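P.S. For anyone following along, a minimal sketch of the point above: a lone 0xA0 byte is not valid UTF-8, but decoding it with the single-byte charset the data actually came from works. ISO-8859-1 is used here only as a stand-in for whatever the database charset really is:

```csharp
using System;
using System.Text;

class Sketch
{
    static void Main()
    {
        byte[] raw = { 0xA0 };

        // 0xA0 is a UTF-8 continuation byte with no lead byte, so on its
        // own it is malformed; depending on the runtime's decoder fallback
        // it decodes to nothing or to U+FFFD, never to the intended char.
        Console.WriteLine(Encoding.UTF8.GetCharCount(raw));

        // Decoding with the matching single-byte encoding recovers the
        // character: in ISO-8859-1, byte 0xA0 is U+00A0 (no-break space).
        Encoding latin1 = Encoding.GetEncoding("ISO-8859-1");
        char[] chars = latin1.GetChars(raw);
        Console.WriteLine((int)chars[0]); // 160
    }
}
```

In other words, the cast (byte)c in the quoted advice only round-trips for characters whose code point happens to equal their byte value in the connection charset; anything else needs Encoding.GetChars/GetBytes with the correct encoding.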