We are trying to transform very large XML documents (30 MB to 1 GB in size) and
were planning to use an XSLT engine for it. Our tests showed that passing the
document in as a DOM typically crashed with out-of-memory errors.

However, even when passing in SAX events, memory consumption was around
FIVE times the document size (e.g., a 50 MB input document consumed 250 MB of
JVM heap).

We would appreciate any input on ways to reduce the memory consumption, and,
more generally: is XSLT the way to go for such large documents? What are the
alternatives? (By the way, we tried asking the customer to reduce or break up
the document - not feasible!)
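One alternative we are considering, assuming the transformation can be done in
a single forward pass: skip XSLT and process the document as a stream, so only
the current event is ever in memory. A sketch using StAX (element names and the
inline input are made up; the real input would be a FileInputStream):

```java
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

public class StreamingCount {
    public static void main(String[] args) throws Exception {
        // Stand-in for the multi-hundred-megabyte document.
        String xml = "<orders><order id='1'/><order id='2'/><order id='3'/></orders>";

        XMLInputFactory f = XMLInputFactory.newInstance();
        XMLStreamReader r = f.createXMLStreamReader(new StringReader(xml));

        // Pull events one at a time; heap use stays constant regardless
        // of document size because no tree is ever built.
        int orders = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "order".equals(r.getLocalName())) {
                orders++;
            }
        }
        r.close();
        System.out.println(orders); // prints 3
    }
}
```

The trade-off is losing XSLT's declarative matching: the transformation logic
has to be hand-coded against the event stream.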

Thanks
Rajesh

