I'm using FOP 0.20.1. I started with a 650KB XML file that I transformed into a 4MB XSL-FO file. Running this file through FOP to generate a PDF used about 90MB of memory.
Initial heap size: 807Kb
Current heap size: 91637Kb
Total memory used: 90829Kb
Memory use is indicative; no GC was performed
These figures should not be used comparatively
Total time used: 31265ms
Pages rendered: 17
Avg render time: 1839ms/page

I have XML files in excess of 15MB that need to be converted to PDF. Assuming a linear extrapolation is possible, the JVM running the FOP process would need in excess of 2GB of memory to avoid the dreaded java.lang.OutOfMemoryError.

Are there any optimizations that can be done to FOP?

Thanks.

-Steve Maring
Nielsen Media Research
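For reference, the 2GB figure follows from a simple proportion; a quick sketch (the heap and input sizes are taken from the run above, and the 15MB input is the stated target size):

```python
# Back-of-the-envelope check of the linear extrapolation:
# a 650 KB XML input consumed ~90829 KB of heap in the FOP run above.
xml_kb = 650
heap_kb = 90829

# Target input size: 15 MB of XML.
big_xml_kb = 15 * 1024

# Scale heap usage linearly with input size.
projected_heap_kb = heap_kb * big_xml_kb / xml_kb
print(f"Projected heap: {projected_heap_kb / 1024 / 1024:.2f} GB")
# → Projected heap: 2.05 GB
```

So even the optimistic linear assumption lands just over the 2GB mark.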