When there is no clue about the content encoding of a form and no
default value, most frameworks end up using the platform default
encoding (on a Debian system, this seems to be ISO-8859-1). The problem
is that most browsers do not send the content encoding along with the
form, so the solution is generally to set the default encoding of your
framework to UTF-8 (assuming the framework can do it). Another solution
is to use a request filter that calls request.setCharacterEncoding("UTF-8").
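As a rough sketch (not tested against your setup, and assuming the plain
Servlet API that ships with Tomcat 5.0), such a filter could look like
this; map it to /* in web.xml so it runs before anything reads the
request parameters:

    import java.io.IOException;
    import javax.servlet.*;

    // Sketch of a filter that forces UTF-8 on incoming requests.
    public class CharacterEncodingFilter implements Filter {

        public void init(FilterConfig config) throws ServletException {
            // nothing to configure in this sketch
        }

        public void doFilter(ServletRequest request, ServletResponse response,
                             FilterChain chain) throws IOException, ServletException {
            // Must run before getParameter()/getReader(), otherwise it has no effect.
            if (request.getCharacterEncoding() == null) {
                request.setCharacterEncoding("UTF-8");
            }
            chain.doFilter(request, response);
        }

        public void destroy() {
        }
    }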

Morten Andersen wrote:

> On my site the users edit pages using a multipart form. There are
> differences between the way the content is being decoded on the server
> depending on the OS. (My guess). The uploaded content is stored in XML
> files and must then later be displayed to the user. I have spent the
> weekend trying to get this to work on the Debian based production server.
>
> How does the OS effect the decoding / accepting of the submitted forms?
>
> I have the following servers:
>
> Tomcat 5.0.28
> Development server: Windows
> Old server: FreeBSD
> New server: Debian (Can't accept UTF-8 form submits)
>

