Hello all,

I'm running into some issues and was wondering if anyone on the list had any suggestions.

In my application, I have a wrapper that reads XML from a socket, does some processing, and writes the
result XML back to the same socket. (Standard stuff.) My input messages are small, less than 1K about 90% of
the time; the output messages, on the other hand, can get large: 100K+ is not uncommon.
I build up a representation of the output document in DOM and use the XMLSerializer (as a DOMSerializer)
to give me a string representation. Until I do that, my program is reasonably light on memory.
To avoid java.lang.OutOfMemoryError during serialization, I have to raise the max heap size to
~30,000,000 bytes (from the default of 16,777,216), and the result is an 11K XML file. That
seems like a lot of heap space to serialize a small document... I should also
mention that if the generated tree has a serialized size of less than 1K, there is no memory problem. (I haven't
been able to find the exact breaking point yet.)

Now, I thought that by feeding the processor a Reader and the serializer a Writer (hooked right up to the socket),
I might be able to alleviate some of the memory fragmentation and overhead. Unfortunately, InputSource doesn't
seem to be socket-friendly: the parse hangs, apparently waiting for the input stream to end (or maybe for the
socket to go away). Has anyone had experience with this?
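For what it's worth, the hang is consistent with the parser reading until end-of-stream: a DOM parse of an InputSource wrapped around a socket stream won't return until the peer closes its side. One workaround is to frame each message yourself and hand the parser an in-memory buffer that has a real EOF. A minimal sketch, assuming a hypothetical 4-byte length prefix on the wire (FramedXmlReader and readMessage are made-up names, not part of any API):

```java
import java.io.ByteArrayInputStream;
import java.io.DataInputStream;
import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

public class FramedXmlReader {
    // Read one length-prefixed message off the stream, then parse it from an
    // in-memory buffer so the parser sees a genuine EOF instead of blocking
    // on the open socket. The 4-byte length prefix is a protocol assumption.
    public static Document readMessage(DataInputStream in) throws Exception {
        int len = in.readInt();     // hypothetical framing: 4-byte message length
        byte[] buf = new byte[len];
        in.readFully(buf);          // blocks only until this message is complete
        DocumentBuilder db =
            DocumentBuilderFactory.newInstance().newDocumentBuilder();
        return db.parse(new InputSource(new ByteArrayInputStream(buf)));
    }
}
```

If you can't change the wire format, a delimiter-based framing (scanning for a known end tag) works the same way: the point is that the parser must be handed a stream that actually ends.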

Is there a better way of serializing the document?

I figured I'd ask to see what others' experiences have been with this sort of thing... Any help would be greatly
appreciated.

Thanks,
-Pete

Serialization:
        // Buffer the entire serialized document in memory as a String
        // (this is where the heap usage spikes for large documents).
        stringOut = new StringWriter();
        OutputFormat format = new OutputFormat(m_ResultDocument);
        format.setIndenting(true); // Temporary, for readability
        XMLSerializer serial = new XMLSerializer(stringOut, format);
        serial.asDOMSerializer(); // use the DOM serialization interface
        serial.serialize(m_ResultDocument.getDocumentElement());
        String sResult = stringOut.toString();
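One way to sidestep the StringWriter (and the String copy it produces) is to serialize straight to the socket's OutputStream, so the whole document is never buffered in memory at once. The sketch below uses the JAXP identity Transformer, which ships with the JDK, rather than Xerces' XMLSerializer; Xerces' XMLSerializer also takes an OutputStream in its constructor if you'd rather stay with it. StreamingSerializer is a made-up name:

```java
import java.io.OutputStream;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.dom.DOMSource;
import javax.xml.transform.stream.StreamResult;
import org.w3c.dom.Document;

public class StreamingSerializer {
    // Serialize the DOM directly to an OutputStream (e.g.
    // socket.getOutputStream()) instead of collecting it into a String first.
    public static void serialize(Document doc, OutputStream out) throws Exception {
        // An identity transform copies the DOM to the result unchanged.
        Transformer t = TransformerFactory.newInstance().newTransformer();
        t.setOutputProperty(OutputKeys.INDENT, "yes"); // temporary, for readability
        t.transform(new DOMSource(doc), new StreamResult(out));
    }
}
```

This won't shrink the DOM tree itself, but it removes the second full copy of the document (the StringWriter's buffer plus the final String), which may be what's pushing you past the default heap.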
