It seems that FF3 will always attach charset=UTF-8 to POST requests,
even if you specify a different charset.

I tested this on FF2 and changed the charset to the one I was after
(windows-1253), but non-latin characters were still not stored and
displayed correctly. I assume my problem is server-side rather than
client-side, but the question remains: why do non-latin characters
display correctly on a plain (non-ajax, non-javascript) POST, and why
do they turn into unreadable characters otherwise? (FF converts them
to UTF-8, whilst IE turns them into question marks.)

Anyway, I wrote a small plugin that submits my forms in a hidden iframe
to avoid ajax altogether. Problem solved. Works in all browsers,
including IE. Hope it helps: http://www.nicolas.rudas.info/jQuery/xajaSubmit.js

On Jan 24, 7:30 pm, Nicolas R <ruda...@googlemail.com> wrote:
> Hi Mike,
>
> Unfortunately no I can't post a link, the site is private and contains
> lots of sensitive user data.
>
> I observe the request using FF's Firebug. As I said, using GET will
> change the content-type to whatever I specify. When using POST, the
> content-type may change but the charset part will not.
>
> For example, if I specify
> xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded;charset=windows-1253');
>
> Firebug will show the content-type to be
>
> Content-Type : application/x-www-form-urlencoded; charset=UTF-8
>
> I tried using the form plugin and doing ajaxSubmit rather than $.ajax,
> but still the charset remains UTF-8.
>
> Is this normal JS behaviour?
>
> Thanks
>
> Nicolas
>
> On Jan 24, 3:14 am, Mike Alsup <mal...@gmail.com> wrote:
>
> > > > I need some help here guys.
> > > > I'm trying to modify the content-type and accept-charset request
> > > > headers of an ajax call, and it seems that beforeSend does not really
> > > > change the XHR object.
>
> > > > My code is something like this:
> > > >     beforeSend : function(xhr) {
> > > >         xhr.setRequestHeader('Accept-Charset', 'windows-1253');
> > > >         xhr.setRequestHeader('Content-type', 'application/x-www-form-urlencoded;charset=windows-1253');
> > > >     }
>
> > > > I need the charset to be windows-1253 and not UTF-8, as the database
> > > > and everything in between (server side scripts) are encoded with
> > > > windows-1253.
>
> > > > My html page has the correct charset specified:
> > > >     <meta http-equiv="Content-Type" content="text/html; charset=windows-1253" />
> > > >     <meta http-equiv="Content-Script-Type" content="text/javascript; charset=windows-1253" />
>
> > > > If I submit the form without ajax, the charset is ok and my data is
> > > > saved correctly. Otherwise non-latin characters are replaced with
> > > > weird characters. From what I understand, changing the charset &
> > > > encoding to UTF-8 is currently not an option.
>
> > > > Any suggestions? Is this a jquery bug, or am I doing something wrong?
> > > It seems that when I use GET instead of POST, the content type header
> > > is correctly changed to what I specify.
>
> > > But even so, this is not a stable fix as I want to POST.
>
> > > Any ideas?
>
> > Hi Nicolas,
>
> > Can you post a link to a page that shows this behavior?  How are you
> > observing the outgoing request headers?
>
> > Mike
>
>
