Indeed, thanks to a recommendation from Mr. Lillywhite to use a
separate page-sequence for each of my pages instead of one sequence
for all of them, my memory footprint for a 500-page report went from
270+ MB to 1.4 MB. This was using FOP 0.20.2.
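
For anyone who wants to try the same trick, here is a minimal sketch
of the resulting FO structure (the master name and content are
placeholders, and on some 0.20.x releases the page-sequence attribute
is spelled master-name rather than master-reference):

  <fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
    <fo:layout-master-set>
      <fo:simple-page-master master-name="page"
          page-height="11in" page-width="8.5in">
        <fo:region-body/>
      </fo:simple-page-master>
    </fo:layout-master-set>
    <!-- One page-sequence per page/section instead of a single
         sequence around the whole report: FOP renders and frees
         each sequence as soon as it is closed. -->
    <fo:page-sequence master-reference="page">
      <fo:flow flow-name="xsl-region-body">
        <fo:block>Page 1 content...</fo:block>
      </fo:flow>
    </fo:page-sequence>
    <fo:page-sequence master-reference="page">
      <fo:flow flow-name="xsl-region-body">
        <fo:block>Page 2 content...</fo:block>
      </fo:flow>
    </fo:page-sequence>
  </fo:root>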

dave



On Friday, November 9, 2001, at 12:31 PM, Lloyd McKenzie/CanWest/IBM wrote:

>
> I'm not sure if this will help or not, but it worked well for me.
>
> I was trying to process a 64 MB document, and it was taking DAYS and
> eating gobs of memory.  I did some wading through the code, looking
> for ways to optimize.  I found a couple of places to reduce memory,
> but nothing substantial.  (I plan to run some analysis on my
> changes, and if they make a difference of more than 5%, I'll submit
> them for inclusion in a future release.)  However, in my wandering
> through the code, I realized that FOP parses and stores everything
> until it runs into an end-of-page-sequence marker.  My XML document
> was one BIG page-sequence, so FOP was parsing the entire thing
> before it would start to generate output.  As my XML consisted of a
> large number of fairly independent sections, I modified my XSLT to
> put each section into a different page-sequence.  The result is that
> FOP only parses objects to the end of each page-sequence, spits out
> the pages for that sequence, and garbage-collects the objects before
> moving on.  The only data retained are link references.  These eat
> up a bit of memory, but nothing as bad as all of the area references
> needed to draw the page :>
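>
> A rough sketch of the kind of XSLT change I mean (element and master
> names are made up for illustration; the template that emits fo:root
> and the layout-master-set is omitted, and templates matching
> 'section' are assumed to produce fo:blocks):
>
>   <!-- One fo:page-sequence per section instead of one sequence
>        around the whole document, so FOP can render and
>        garbage-collect each section independently. -->
>   <xsl:template match="report">
>     <xsl:for-each select="section">
>       <fo:page-sequence master-reference="page">
>         <fo:flow flow-name="xsl-region-body">
>           <xsl:apply-templates select="."/>
>         </fo:flow>
>       </fo:page-sequence>
>     </xsl:for-each>
>   </xsl:template>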
>
> Hope this helps,
>
>
> Lloyd
>
> Lloyd McKenzie, P.Eng.              I/T Architect, IBM Global Services
> Internet: [EMAIL PROTECTED]
> PhoneMail: (780)421-5620          Internal Mail:AZ*K0R*1004 *EDM
>
>
> Matt Savino <[EMAIL PROTECTED]> on 11/09/2001 08:21:53 AM
>
> Please respond to [EMAIL PROTECTED]
>
> To:   [EMAIL PROTECTED]
> cc:
> Subject:  Re: FOP memory usage
>
>
> Make sure you're using -hotspot. Try setting the initial and max
> heap size to 256M if you have it. Turn on verbose garbage collection
> to see what's happening. Even though it says 'no GC was performed',
> I'm not sure that's accurate (see below). Also, the total memory
> used is sometimes negative. So don't assume you'll always run out of
> memory. That said, 15MB of XML to 120MB of PDF may be a little much.
> The only way to find out is to try it!
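>
> For reference, the sort of command line I mean (the class name is
> the 0.20.x CLI entry point; jar paths are placeholders, and -ms/-mx
> are the older spellings of -Xms/-Xmx):
>
>   java -hotspot -verbose:gc -ms256m -mx256m \
>        -cp fop.jar:lib/xerces.jar:lib/xalan.jar \
>        org.apache.fop.apps.Fop -fo report.fo -pdf report.pdf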
>
> <<< my output on WebLogic with -hotspot -verbose:gc -ms256m -mx256m;
> the bracketed [1]..[4] page markers printed by FOP are interleaved
> with the GC lines >>>
>
> FopServlet says hi
> [GC 14710K->12798K(261888K), 0.0258153 secs]
> [GC 14840K->13743K(261888K), 0.0275211 secs]
> [Full GC 15436K->13778K(261888K), 0.7851467 secs]
> [GC 15825K->14079K(261888K), 0.0097378 secs]
> [GC 16127K->14306K(261888K), 0.0203590 secs]
> [GC 16354K->14835K(261888K), 0.0211491 secs]
> [GC 16883K->14911K(261888K), 0.0125452 secs]
> [GC 16959K->14949K(261888K), 0.0097037 secs]
> [GC 16997K->14981K(261888K), 0.0080228 secs]
> building formatting object tree
> setting up fonts
> [GC 17029K->15288K(261888K), 0.0154997 secs]
> [GC 17336K->15777K(261888K), 0.0254016 secs]
> [GC 17825K->16324K(261888K), 0.0199059 secs]
>  [1[GC 18372K->16920K(261888K), 0.0248386 secs]
> [GC 18968K->17332K(261888K), 0.0178556 secs]
> [GC 19380K->17702K(261888K), 0.0221106 secs]
> ] [2][GC 19750K->18117K(261888K), 0.0219930 secs]
> [GC 19021K->18525K(261888K), 0.0153204 secs]
> [GC 19952K->19940K(261888K), 0.0163652 secs]
> [GC 21009K->21005K(261888K), 0.0129846 secs]
> [GC 22075K->22075K(261888K), 0.0132101 secs]
>  [3[GC 24122K->23293K(261888K), 0.0148726 secs]
> ][GC 25341K->23623K(261888K), 0.0144110 secs]
>  [4[GC 25671K->23925K(261888K), 0.0167574 secs]
> [GC 25973K->24281K(261888K), 0.0171810 secs]
> ]
> Parsing of document complete, stopping renderer
> Initial heap size: 15357Kb
> Current heap size: 24716Kb
> Total memory used: 9358Kb
>   Memory use is indicative; no GC was performed
>   These figures should not be used comparatively
> Total time used: 5117ms
> Pages rendererd: 4
> Avg render time: 1279ms/page
>
> "Maring, Steve" wrote:
>>
>> I'm using fop-0.20.1.
>>
>> I started with a 650KB XML file that I transformed into a 4MB
>> XSL-FO file.  Running this file through FOP to generate a PDF used
>> about 90MB of memory.
>>
>> Initial heap size: 807Kb
>> Current heap size: 91637Kb
>> Total memory used: 90829Kb
>>   Memory use is indicative; no GC was performed
>>   These figures should not be used comparatively
>> Total time used: 31265ms
>> Pages rendererd: 17
>> Avg render time: 1839ms/page
>>
>> I have XML files in excess of 15MB that need to be converted to
>> PDF.  Assuming a linear extrapolation is possible (15MB / 650KB is
>> roughly a factor of 23, and 23 x 90MB is about 2GB), the JVM
>> running the FOP process would need in excess of 2GB of memory to
>> avoid the dreaded java.lang.OutOfMemoryError.
>>
>> Are there any optimizations that can be done to FOP?
>>
>> Thanks.
>> -Steve Maring
>> Nielsen Media Research
>>


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, email: [EMAIL PROTECTED]
