Hi all. I am using fop-0.20.4 to create PDF and text files from XML files encoded in ISO-8859-1. While the PDF files are ok, the text files are always UTF-8 encoded.
By looking at the TXTRenderer's sources I found the reason for this behaviour: the TXTRenderer uses the TXTStream class to write to an OutputStream, and this TXTStream assumes a UTF-8 encoding:

    public void add(String str) {
        if (!doOutput)
            return;
        try {
            byte buff[] = str.getBytes("UTF-8");
            out.write(buff);
        } catch (IOException e) {
            throw new RuntimeException(e.toString());
        }
    }

I don't want to simply change this to another fixed encoding, even though - at least as far as I know now - I will always use this encoding. My FO files always contain an encoding attribute in the XML declaration, so I thought the ContentHandler might instruct the renderer which encoding to use, but the SAXContentHandler does not get this information.

I would like to fix this, but I am not sure how to do it. Is there a preferable way to tell the renderer which encoding to use?

thanks
Torsten

--
_________________________________________________________
Torsten Straube * picturesafe media/data/bank GmbH
Lüerstr. 3 * D-30175 Hannover * phone: 0511/85620-53
fax: 0511/85620-10 * mailto:[EMAIL PROTECTED]
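P.S. A minimal sketch of the kind of change I have in mind: make TXTStream's encoding a field instead of a literal. Note that the setEncoding() method here is my own invention and not part of fop-0.20.4's actual TXTStream API; it just shows where a renderer could pass the encoding down.

```java
import java.io.IOException;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;

// Hypothetical variant of FOP's TXTStream with a configurable encoding.
// setEncoding() is an assumed addition, not present in fop-0.20.4.
class TXTStream {
    private final OutputStream out;
    private boolean doOutput = true;
    private String encoding = "UTF-8"; // keep the current behaviour as default

    TXTStream(OutputStream os) {
        this.out = os;
    }

    // Assumed new hook: whoever drives the renderer could call this with
    // the encoding found in the XML declaration of the FO file.
    void setEncoding(String encoding) {
        this.encoding = encoding;
    }

    void add(String str) {
        if (!doOutput)
            return;
        try {
            byte[] buff = str.getBytes(encoding);
            out.write(buff);
        } catch (UnsupportedEncodingException e) {
            throw new RuntimeException("Unknown encoding: " + encoding);
        } catch (IOException e) {
            throw new RuntimeException(e.toString());
        }
    }
}
```

With setEncoding("ISO-8859-1"), a string like "Lüerstr." would come out as 8 Latin-1 bytes instead of 9 UTF-8 bytes.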
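P.P.S. On the ContentHandler side: the SAX 2.0.1 extension interface org.xml.sax.ext.Locator2 exposes the entity's encoding via getEncoding(). I don't know whether the parsers bundled with fop-0.20.4 implement it, so treat this as a sketch of how a handler could in principle pick up the declared encoding; the class and method names below (EncodingSniffer, sniff) are my own.

```java
import java.io.InputStream;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.InputSource;
import org.xml.sax.Locator;
import org.xml.sax.ext.Locator2;
import org.xml.sax.helpers.DefaultHandler;

// Sketch: recover the encoding of the parsed entity through the SAX
// Locator2 extension, if the parser provides one.
class EncodingSniffer extends DefaultHandler {
    private Locator2 locator2;
    String encoding;

    @Override
    public void setDocumentLocator(Locator locator) {
        // The parser calls this before startDocument(); keep the locator
        // so the encoding can be read once parsing has begun.
        if (locator instanceof Locator2) {
            locator2 = (Locator2) locator;
        }
    }

    @Override
    public void startDocument() {
        if (locator2 != null) {
            encoding = locator2.getEncoding();
        }
    }

    // Convenience wrapper: parse a byte stream and return the encoding
    // the parser reported, or null if Locator2 is unsupported.
    static String sniff(InputStream in) {
        try {
            EncodingSniffer handler = new EncodingSniffer();
            SAXParserFactory.newInstance().newSAXParser()
                    .parse(new InputSource(in), handler);
            return handler.encoding;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

A renderer that received this value could then pass it on to its output stream instead of assuming UTF-8.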