Hi,

I have a web site where the most visited page type is generated from an
XML file and an XSLT transformation.

I'm looking for the best performance.

The code I use is the following:


                 Document doc = null;
                 SAXReader reader = new SAXReader(false);
                 try {
                     doc = reader.read(url);
                 }
                 catch (Exception e) {
                     doc = null;
                 }

                 if (doc != null) {
                     JspWriter out = pageContext.getOut();

                     Transformer transformer = getXSLTTransform("/scheda.xslt");

                     DocumentSource source = new DocumentSource(doc);
                     DocumentResult result = new DocumentResult();
                     transformer.transform(source, result);

                     // write the transformed document to the response
                     Document transformedDoc = result.getDocument();
                     HTMLWriter html = new HTMLWriter(out);
                     html.setEscapeText(false);
                     html.write(transformedDoc);
                 }
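
One thing worth measuring: the code above builds a second full Document (via DocumentResult) and then serializes it with HTMLWriter. If the HTMLWriter pass isn't strictly needed, the transformer can write straight into the response writer, so the transformed tree is never materialized. Below is a minimal, dependency-free sketch of that idea using plain JAXP sources (with dom4j you would pass your DocumentSource instead of the StreamSource); the stylesheet and element names are made up for illustration:

```java
import javax.xml.transform.*;
import javax.xml.transform.stream.*;
import java.io.*;

public class DirectTransform {
    // Transform straight into the output writer: no intermediate
    // transformed Document and no extra serialization pass,
    // so less garbage per request.
    public static void transformTo(Writer out, Reader xml, Templates templates)
            throws TransformerException {
        Transformer transformer = templates.newTransformer();
        transformer.transform(new StreamSource(xml), new StreamResult(out));
    }

    public static void main(String[] args) throws Exception {
        // Hypothetical stylesheet, just to make the sketch runnable.
        String xslt =
            "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
            + "<xsl:output method='html' omit-xml-declaration='yes'/>"
            + "<xsl:template match='/root'><p><xsl:value-of select='.'/></p></xsl:template>"
            + "</xsl:stylesheet>";
        Templates templates = TransformerFactory.newInstance()
            .newTemplates(new StreamSource(new StringReader(xslt)));

        StringWriter out = new StringWriter();
        transformTo(out, new StringReader("<root>ciao</root>"), templates);
        System.out.println(out);
    }
}
```

Whether this beats the DocumentResult + HTMLWriter route depends on whether you rely on HTMLWriter's escaping behavior; `<xsl:output method='html'/>` covers the common cases.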



Now I'm caching the Transformer object.
I'd like to know whether it is possible, and whether it makes sense in
terms of performance and reduced garbage production, to cache other
objects as well, such as the SAXReader.
I'm also interested in understanding whether this code is the best way
to minimize garbage production.

Thank you


--
Ing. Andrea Vettori
Information Technology Consultant



_______________________________________________
dom4j-user mailing list
dom4j-user@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/dom4j-user
