Title: FOP performance - frustrating.. help!!
Thanks for your reply... given the XML and the XSL, how would I generate the FO so I can post it?
Thanks
-----Original Message-----
From: Rhett Aultman [mailto:[EMAIL PROTECTED]]
Sent: Friday, December 27, 2002 3:35 PM
To: [EMAIL PROTECTED]
Subject: RE: FOP performance - frustrating.. help!!

Really, what the rows look like in XML is not helpful to us.  What is helpful is seeing the FO, since FOP transforms from the FO and not really from the XML.  You may be using FO markup that's making the transformation and rendering costly, but we wouldn't be able to tell you for sure without seeing the FO.
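For reference, one common way to capture the intermediate FO is to run the same stylesheet through a plain JAXP Transformer and send the result to a file instead of FOP's content handler. A minimal sketch - the file names below are placeholders, not anything from the original post:

    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamResult;
    import javax.xml.transform.stream.StreamSource;

    public class DumpFo {
        public static void main(String[] args) throws Exception {
            // Same XSL that is normally fed to FOP, but the output goes to a
            // .fo file on disk instead of driver.getContentHandler().
            Transformer transformer = TransformerFactory.newInstance()
                    .newTransformer(new StreamSource("report.xsl"));
            transformer.transform(new StreamSource("report.xml"),
                                  new StreamResult("report.fo"));
        }
    }

The resulting report.fo is what FOP actually renders, so that is the piece worth posting and profiling.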
-----Original Message-----
From: Lee, Insoo [mailto:[EMAIL PROTECTED]]
Sent: Friday, December 27, 2002 3:34 PM
To: '[EMAIL PROTECTED]'
Subject: FOP performance - frustrating.. help!!



>   Hello, I'm getting a little frustrated with FOP performance.
>   We are generating an XML document on the fly that contains a few tables with
> about 1,000 rows in total.
>   We pass this XML down to the transformer along with the XSL to generate the PDF
> from our servlet (see the servlet sketch after this message):
>   (transformer.transform(inXML, new SAXResult(driver.getContentHandler()));)
>   It works, but it is slow.
>   It takes about 5 minutes for 1,000 rows, and that is only the transformation
> itself (excluding any data query time or JDOM XML building time).
>   I read through past emails and FAQs, and I think I have tried all the
> suggestions, but it still does not improve the performance...
>
>   Please help!!
>  
>   Here is what I have done.
>
>   1) run with more memory - yes, I tried with 512MB (I don't get an
> OutOfMemory error)
>   2) run in separate VM - only helps other requests coming to the site
>   3) try with the latest FOP - I did - mine is fop-0.20.4
>   4) try with the latest xalan.jar and xerces.jar - tried them, but no
> improvement
>   5) cache the XSL stylesheet - didn't really help... (see the caching sketch
> after this list)
>   6) use multiple page-sequences - got it to work and I don't get an
> OutOfMemory exception, but performance still didn't get better.
>          (even if I break them into multiple page-sequences, I will still
> have to read through one row at a time, right? - that's why performance is
> not really improving..)
>
>   7) no forward-reference - I don't have any page numbers
>   8) try with new JDK - trying with JDK 1.3
>   9) serialize servlet requests - haven't done this, but I doubt it will
> have any performance impact (I'm testing with one request for now)
>   10) no images - I don't have any...
>   11) Don't make the XML tree too deep - mine is very flat, only 2 levels
> deep.
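As a reference for point 5 above, caching the stylesheet with JAXP usually means compiling the XSL once into a javax.xml.transform.Templates object and creating a fresh Transformer from it for each request. A minimal sketch, with a hypothetical helper class and a placeholder stylesheet path:

    import javax.xml.transform.Templates;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.TransformerFactory;
    import javax.xml.transform.stream.StreamSource;

    // Hypothetical holder: compile the stylesheet once, reuse it for every request.
    public class StylesheetCache {
        private static Templates templates;

        public static synchronized Transformer newTransformer() throws Exception {
            if (templates == null) {
                templates = TransformerFactory.newInstance()
                        .newTemplates(new StreamSource("report.xsl"));
            }
            // Templates is thread-safe; individual Transformers are not,
            // so hand out a new one per request.
            return templates.newTransformer();
        }
    }

Note that this only removes the per-request stylesheet compilation cost; it will not by itself shrink a 5-minute render, which is more likely dominated by the FO layout work itself.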
>
>   I have 1,000 rows, and each row looks something like the following in XML:
>
>   <ROW>
>         <TA_FUND_CODE>539</TA_FUND_CODE>
>         <LONG_NAME>Some Fund</LONG_NAME>
>         <DAILY_FACTOR>0.00003760000</DAILY_FACTOR>
>         <SHORT_RATE_365>NA</SHORT_RATE_365>
>         <LONG_RATE_365>NA</LONG_RATE_365>
>         <RATE_360>NA</RATE_360>
>         <CURR_7_DAY_YIELD>NA</CURR_7_DAY_YIELD>
>         <SEVEN_DAY_EFF_YIELD>1.400000000</SEVEN_DAY_EFF_YIELD>
>         <AVG_30_DAY_RATE>1.580000000</AVG_30_DAY_RATE>
>         <PRICE>1.0000</PRICE>
>         <RATE_DATE>09 Dec 2002</RATE_DATE>
>         <CURRENCY>USD</CURRENCY>
>         <STATUS>I</STATUS>
>   </ROW>
>  
>
>   Here is the question:
>     Is it normal to take this long - 5 minutes - to produce a few tables
> with a combined total of 1,000 rows (about 30 pages)?
>
>     Is there any better way?
>
>   Thank you for your suggestions.
>
>   Regards,
>   IL
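For context on the pipeline described above, a typical FOP 0.20.x servlet setup looks roughly like the sketch below. It is only an illustration of the usual Driver wiring, not the poster's actual code; the servlet name and input path are placeholders, and it reuses the hypothetical StylesheetCache from the earlier sketch.

    import java.io.ByteArrayOutputStream;
    import java.io.IOException;
    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;
    import javax.xml.transform.Transformer;
    import javax.xml.transform.sax.SAXResult;
    import javax.xml.transform.stream.StreamSource;
    import org.apache.fop.apps.Driver;

    public class ReportServlet extends HttpServlet {
        public void doGet(HttpServletRequest request, HttpServletResponse response)
                throws ServletException, IOException {
            try {
                ByteArrayOutputStream out = new ByteArrayOutputStream();

                // FOP 0.20.x Driver: render PDF into an in-memory buffer.
                // (A logger is normally set on the driver as well; omitted here.)
                Driver driver = new Driver();
                driver.setRenderer(Driver.RENDER_PDF);
                driver.setOutputStream(out);

                // Reuse the precompiled stylesheet (see the caching sketch above)
                // and feed the FO SAX events straight into FOP.
                Transformer transformer = StylesheetCache.newTransformer();
                transformer.transform(new StreamSource("report.xml"),
                                      new SAXResult(driver.getContentHandler()));

                response.setContentType("application/pdf");
                response.setContentLength(out.size());
                out.writeTo(response.getOutputStream());
            } catch (Exception e) {
                throw new ServletException(e);
            }
        }
    }

If the FO itself is the bottleneck (for example, very wide tables or heavy use of keeps), that cost will show up regardless of how the servlet is wired, which is why posting the generated FO is the useful next step.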
