hernan gonzalez writes:
> BTW, I understand that postgresql uses locale semantics in the server code.
> But is this really necessary/appropriate in the client (psql) side?
> Couldn't we stick with the C locale here?
As far as that goes, I think we have to turn on that machinery in order
to have gettext.
Wow, you are right, this is bizarre...
And it's not that glibc intends to compute the length in Unicode chars;
it actually counts bytes (plain C chars), as it should, for computing
field widths...
But, for some strange reason, when there is some width calculation involved
it tries to parse the characters
hernan gonzalez writes:
> Sorry about an error in my previous example (I mixed width and precision).
> But the conclusion is the same: it works on bytes.
This example works like that because it's running in C locale always.
Try something like this:
#include <stdio.h>
#include <locale.h>
int main () {
    char s[] = "ni\xc3\xb1o";   /* 5 bytes, 4 UTF-8 chars */
    setlocale(LC_ALL, "");
    printf("|%.*s|\n", 4, s);
    return 0;
}
Sorry about an error in my previous example (I mixed width and precision).
But the conclusion is the same: it works on bytes.
#include <stdio.h>
int main () {
    char s[] = "ni\xc3\xb1o";  /* 5 bytes, 4 UTF-8 chars */
    printf("|%*s|\n", 6, s);   /* this should pad a blank */
    printf("|%.*s|\n", 4, s);
    return 0;
}
However, it appears that glibc's printf
code interprets the parameter as the number of *characters* to print,
and to determine what counts as a character it assumes the string is in the
environment's LC_CTYPE encoding.
Well, I find that hard to believe myself :-)
That would be nasty... Are you sure?
hernan gonzalez writes:
> The issue is that psql (apparently) tries to convert to UTF8
> (even when it plans to output the raw text, LATIN9 in this case)
> just to compute the length of the field, to build the table.
> And because for this computation it (apparently) relies on the string
> routines
Mmm, no: \x displays correctly for me because it sends
the raw text (in LATIN9) and I have set my terminal to LATIN9 (or ISO-8859-15).
And it's not that "xterm is misdisplaying" the text; it's just that psql
is outputting an EMPTY (zero-length) string for that field.
(I can even send the output to a file and check.)
hernan gonzalez writes:
> But I actually don't want that. I want psql to not try any charset
> conversion, just give me the raw text as it is stored in the db.
Well, that's what it's doing (given the default setting with
client_encoding equal to server_encoding), and then xterm is
misdisplaying the text.
It's surely not an xterm problem; I see the characters OK with just the
\x formatting. I can also check the output by redirecting it to a file.
My original client_encoding seems to be LATIN9 in both cases,
according to the \set output.
If I change it (for the root user) to UTF8 with " SET CLIENT_ENCODING
hernan gonzalez writes:
> My scenario: Fedora 12, Postgresql 8.4.3 compiled from source.
> Database encoding (global) LATIN9.
> User postgres locale: LANG=en_US.iso885915,
> User root locale LANG=en_US.UTF-8
> When I connect from postgres user, all is right.
> When I connect from root, it's not.
(Disclaimer: I've been using PostgreSQL for quite a long time and I
usually deal with non-ASCII LATIN-9 characters,
but that has never been a problem, until now.)
My issue summarized: when psql is invoked by a user whose locale
differs from that of the database, the tabular output
is wrong for non-ASCII characters.