You can't, at least not completely. Although it is true that memory is regained when a page is output, some information has to be kept until the very end, and that information keeps growing as you add pages. For instance, the byte offsets of the PDF objects must be stored until the end in order to build the xref (cross-reference) table, and at the moment all of that is kept in memory.
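
To make that concrete, here is a minimal, self-contained Java sketch (not FOP's actual code; the class name, file name and object contents are made up) of why any PDF writer has to hold on to one byte offset per object until the very end: the xref table maps object numbers to byte offsets and can only be written after all objects.

import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class XrefSketch {

    public static void main(String[] args) throws IOException {
        try (OutputStream out = new FileOutputStream("sketch.pdf")) {
            long written = 0;                        // running count of bytes written so far
            List<Long> offsets = new ArrayList<>();  // one entry per PDF object, kept until the very end

            written += write(out, "%PDF-1.4\n");
            for (int i = 1; i <= 3; i++) {           // imagine hundreds of thousands of objects here
                // The object's bytes are streamed out and can be forgotten,
                // but its byte offset must be remembered for the xref table.
                offsets.add(written);
                written += write(out, i + " 0 obj\n<< /Type /ExampleObject >>\nendobj\n");
            }

            // The xref table can only be emitted once every object offset is known.
            long xrefStart = written;
            StringBuilder xref = new StringBuilder("xref\n0 " + (offsets.size() + 1) + "\n");
            xref.append("0000000000 65535 f \n");
            for (long off : offsets) {
                xref.append(String.format("%010d 00000 n \n", off));
            }
            xref.append("trailer\n<< /Size ").append(offsets.size() + 1)
                .append(" >>\nstartxref\n").append(xrefStart).append("\n%%EOF\n");
            write(out, xref.toString());
        }
    }

    private static long write(OutputStream out, String s) throws IOException {
        byte[] bytes = s.getBytes(StandardCharsets.US_ASCII);
        out.write(bytes);
        return bytes.length;
    }
}

In FOP's case the number of objects grows with the number of pages, so this bookkeeping grows too, even though the page content itself is flushed.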

You can get somewhat better results if you allow each page-sequence to hold more than one page (say, 100 or 1,000); a rough sketch of that restructuring follows below. Putting all the pages into a single page-sequence will not work either, because then you will run out of memory during the initial phase of building the FO tree.
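
As a rough illustration (the master-reference name, block content and batch size are only placeholders), instead of emitting one fo:page-sequence per record:

  <!-- one fo:page-sequence per record: hundreds of thousands of small sequences -->
  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>record 1</fo:block>
    </fo:flow>
  </fo:page-sequence>
  <!-- ...repeated for every record... -->

you would batch, say, 1,000 records into each sequence:

  <fo:page-sequence master-reference="A4">
    <fo:flow flow-name="xsl-region-body">
      <fo:block>record 1</fo:block>
      <!-- ... -->
      <fo:block>record 1000</fo:block>
    </fo:flow>
  </fo:page-sequence>

If each record still has to start on a fresh page, putting break-before="page" on its first block keeps that behaviour inside the larger sequence.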

On the other hand, you can always increase the amount of memory your JVM is allowed to use, for example by raising the maximum heap size with -Xmx.
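
For instance, if you run FOP from the command line, something along these lines (the classpath, heap size and file names below are only placeholders; if you run FOP embedded or in a servlet container, set the option in that environment's startup parameters instead):

  java -Xmx1024m -cp "fop.jar:lib/*" org.apache.fop.cli.Main -fo input.fo -pdf output.pdf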

On 4/18/13 12:09 PM, aemitic wrote:
Hi,

I noticed that FOP (any version) uses a constantly growing amount of memory
when creating PDFs, even when using very small <page-sequence> blocks and no
forward references.

Please refer to this Stack Overflow question, which contains a small Java
program that replicates the issue.

How can I limit the amount of memory used when transforming an FO file with
a huge number (> 250000) of <page-sequence> blocks to PDF?

Thanks for any help




---------------------------------------------------------------------
To unsubscribe, e-mail: fop-users-unsubscr...@xmlgraphics.apache.org
For additional commands, e-mail: fop-users-h...@xmlgraphics.apache.org


