I have finished (for now anyway) my Xalan performance analysis and 
have determined that the DTM code is responsible for making Xalan 
unusable in our production environment. I tracked down several data 
structures in the DTM classes, some of which I reported in an earlier 
email, that are initialized to large sizes and are duplicated many times 
during a transformation. The memory consumption is so excessive that 
running concurrent transformations completely overwhelms the machine.
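
For reference, here is a minimal sketch of the kind of concurrent load I am 
describing (the thread count and file names are placeholders, not our actual 
setup). Each thread runs an independent transformation through the standard 
JAXP API, and as I understand it each transformation builds its own DTM for 
the source document, which is where the memory appears to go:

import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;
import java.io.StringWriter;

public class ConcurrentTransformTest {
    public static void main(String[] args) {
        int threads = 10; // number of concurrent transformations (placeholder)
        for (int i = 0; i < threads; i++) {
            new Thread(new Runnable() {
                public void run() {
                    try {
                        // One factory and transformer per thread; each
                        // transformation builds its own DTM for the input.
                        TransformerFactory factory = TransformerFactory.newInstance();
                        Transformer t = factory.newTransformer(
                                new StreamSource("stylesheet.xsl")); // placeholder
                        t.transform(new StreamSource("input.xml"),   // placeholder
                                new StreamResult(new StringWriter()));
                    } catch (Exception e) {
                        e.printStackTrace();
                    }
                }
            }).start();
        }
        // Watch heap growth with -verbose:gc or a profiler while this runs.
    }
}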

  Has there been any analysis that shows a big improvement with the DTM 
model, or does it still need to be tweaked to get it working 
efficiently? I hope someone has done, or plans to do, some evaluation 
of the code to make sure things are actually improving. It's 
unfortunate that version 2.1.0 is the end of the line for us after using 
Xalan very successfully in production for over a year now. Now I have 
to find a suitable replacement.

Chris

