Just to "one up" you.  :)

We have several docs that are > 20 MB of source.  If we run them as a single 
page sequence, they run out of memory.  If we break them up into multiple 
page sequences, they _usually_ work.

So, as Jean-François suggests, using multiple page-sequences is a good 
start.

-Lou




Jean-François El Fouly <[EMAIL PROTECTED]> 
10/16/2008 08:23 AM
Please respond to: fop-users@xmlgraphics.apache.org
To: fop-users@xmlgraphics.apache.org
Subject: Re: Memory issues

Richard Forrester wrote:
>
> Hello,
>
> I have FOP 0.94 and I am running into some memory issues. I have 
> a rather large XML file, 2.5 MB. When I try to create a PDF from this 
> file, my memory usage spikes up to 700 MB when it starts converting. 
> It then seems to stay there, between 600 and 700 MB, almost as if it's 
> frozen. I've waited 5 to 10 minutes to see if it would finish, but it 
> never seems to. Is there something I can do to fix this memory usage 
> issue and make it more manageable?
>
> Thank You!
>
Not willing to play "Mine is bigger than yours" ;-)
but the document I'm working on is 3.5 MB of source, 7.5 MB of FO, and has 
1200 rather large PNG screenshots inside. The whole thing fits easily, 
together with a full AS and a rather large management web app, in 1 GB. 
And all the memory we need is released at the end.
So my best guess is: try to make your document more manageable for FOP by 
breaking it into several fo:page-sequence elements (chapters, sections, 
whatever makes sense in your business). Between page-sequences, many 
resources are released.
In any case, that is what helped us (we had the same problems in the 
beginning).
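For anyone new to this trick, here is a minimal sketch of what the split 
looks like in the FO (master name and content are illustrative, not from 
the original documents discussed here):

```xml
<fo:root xmlns:fo="http://www.w3.org/1999/XSL/Format">
  <fo:layout-master-set>
    <fo:simple-page-master master-name="A4"
        page-height="29.7cm" page-width="21cm" margin="2cm">
      <fo:region-body/>
    </fo:simple-page-master>
  </fo:layout-master-set>

  <!-- One fo:page-sequence per chapter: FOP can release most of the
       data for a sequence once that sequence has been rendered. -->
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 1 content ...</fo:block>
    </fo:flow>
  </fo:page-sequence>

  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>Chapter 2 content ...</fo:block>
    </fo:flow>
  </fo:page-sequence>
</fo:root>
```

In your stylesheet, that usually means generating one fo:page-sequence per 
chapter (or per top-level section) instead of a single sequence for the 
whole document.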

Jean-François El Fouly


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]

