You may not have to rewrite your XSLT. Part of the idea is to reduce the size of the document by eliminating unnecessary content in a transformer before your XSLT is invoked.
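The idea above — strip unneeded markup in a SAX transformer so the XSLT only sees what it needs — can be sketched with a plain JAXP/SAX filter. Note this is an illustrative standalone sketch, not Cocoon's actual transformer API; the `PruneFilter` class name and the element names are invented for the example.

```java
import java.io.StringReader;
import java.io.StringWriter;

import javax.xml.parsers.SAXParserFactory;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.sax.SAXSource;
import javax.xml.transform.stream.StreamResult;

import org.xml.sax.Attributes;
import org.xml.sax.InputSource;
import org.xml.sax.SAXException;
import org.xml.sax.helpers.XMLFilterImpl;

/** SAX filter that drops one named element (and its whole subtree) from the event stream. */
public class PruneFilter extends XMLFilterImpl {
    private final String dropElement;
    private int depth = 0; // > 0 while inside a dropped subtree

    public PruneFilter(String dropElement) {
        this.dropElement = dropElement;
    }

    @Override
    public void startElement(String uri, String local, String qName, Attributes atts)
            throws SAXException {
        if (depth > 0 || local.equals(dropElement)) {
            depth++; // swallow this element and everything nested inside it
            return;
        }
        super.startElement(uri, local, qName, atts);
    }

    @Override
    public void endElement(String uri, String local, String qName) throws SAXException {
        if (depth > 0) {
            depth--;
            return;
        }
        super.endElement(uri, local, qName);
    }

    @Override
    public void characters(char[] ch, int start, int length) throws SAXException {
        if (depth == 0) {
            super.characters(ch, start, length);
        }
    }

    /** Parse xml, drop every 'dropElement' subtree, and serialize what remains. */
    public static String filter(String xml, String dropElement) throws Exception {
        SAXParserFactory spf = SAXParserFactory.newInstance();
        spf.setNamespaceAware(true); // needed so the 'local' names are populated
        PruneFilter f = new PruneFilter(dropElement);
        f.setParent(spf.newSAXParser().getXMLReader());
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new SAXSource(f, new InputSource(new StringReader(xml))),
                new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        String xml = "<doc><keep>data</keep><noise>lots of junk</noise></doc>";
        System.out.println(filter(xml, "noise")); // prints <doc><keep>data</keep></doc>
    }
}
```

Because the filtering happens on the SAX stream, the pruned content is never buffered, so the XSLT stage downstream gets a smaller document without any change to the stylesheet itself.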

Ralph

Boisvert, Éric wrote:

Thanks, I saw that. I wondered if there was some obvious thing I could
check before starting to rewrite the XSLT (I know, I'm lazy).


Eric


-----Original Message-----
From: Ralph Goers [mailto:[EMAIL PROTECTED]
Sent: October 28, 2005 11:51
To: users@cocoon.apache.org
Subject: Re: processing large files


Boisvert, Éric wrote:

Hi all

I need to process large XML files, and as I tested with increasingly larger
files, the processing time suddenly increased a lot. For instance, a 200 KB
file took 0.8 seconds, a 400 KB file 2.5 seconds, and when I get near 1 MB it
jumps to 30 seconds (nearly 10 times, for twice the size). I played with
the pipeline caching, outputBufferSize, etc., and even boosted CATALINA_OPTS to
512 MB; nothing helped. I guess this is related to the fact that at some
point the incoming document can no longer be loaded entirely in memory.

Does anyone have an idea how to fix this?

Cheers and thanks



This was the subject of one of the presentations at the Cocoon GetTogether (http://www.cocoongt.org). Here is a link to the presentation: http://cocoongt.hippo12.castaserver.com/cocoongt/nico-verwer-performance.pdf

Ralph


---------------------------------------------------------------------
To unsubscribe, e-mail: [EMAIL PROTECTED]
For additional commands, e-mail: [EMAIL PROTECTED]
