Hi All,

 

I am trying to understand why it is so inefficient to process large SOAP envelopes, and why attachments are needed instead.

 

After reading a couple of papers on the subject, I have come to understand that the problem lies in parsing such large XML data structures. It seems strange that parsers cannot handle large data structures efficiently, and I would like to understand why. Is the problem associated with large element text strings, or with huge trees?
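For comparison, here is a minimal sketch of my own (illustrative code, not taken from any particular SOAP stack) showing how a streaming SAX parser can walk a large text payload in small chunks, rather than materializing the whole document as an in-memory tree the way a DOM parse does:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.helpers.DefaultHandler;

public class StreamingSize {
    // Streams an envelope with SAX and counts the payload characters.
    // characters() hands us the text in chunks as it is read, so the full
    // payload never has to sit in one buffer (unlike a DOM text node).
    static long charCount(int repeats) throws Exception {
        // Hypothetical envelope: one large Base64-like payload element.
        StringBuilder sb = new StringBuilder("<envelope><payload>");
        for (int i = 0; i < repeats; i++) sb.append("QUJD"); // stand-in data
        sb.append("</payload></envelope>");

        final long[] seen = {0};
        SAXParserFactory.newInstance().newSAXParser().parse(
            new ByteArrayInputStream(
                sb.toString().getBytes(StandardCharsets.UTF_8)),
            new DefaultHandler() {
                @Override
                public void characters(char[] ch, int start, int len) {
                    seen[0] += len; // decode this chunk, write it out, drop it
                }
            });
        return seen[0];
    }

    public static void main(String[] args) throws Exception {
        System.out.println("payload chars streamed: " + charCount(100_000));
    }
}
```

If SOAP stacks worked this way for large text nodes, I would expect memory use closer to the chunk size than to the payload size.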

 

I have built a web app that provides a deployment platform serving remote clients. Clients build interfaces to this service and use SOAP as the communication protocol. The SOAP envelope contains a Base64-encoded .war file. Clients send this data to the server, and the server also serves it to other remote clients wishing to deploy the app. In a test using a 5 MB .war file in the SOAP envelope, with the client and server on the same machine, the process consumes 135 MB of RAM; using attachments, it consumes only 20 MB.
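As a back-of-the-envelope illustration of where I suspect the memory goes (my own rough arithmetic, not measurements from any particular stack): Base64 inflates the payload by about 4/3, and every in-memory Java String copy doubles that again because chars are UTF-16:

```java
public class MemoryEstimate {
    // Padded Base64 output length: ceil(bytes / 3) * 4 characters.
    static long base64Chars(long bytes) {
        return (bytes + 2) / 3 * 4;
    }

    public static void main(String[] args) {
        long warBytes = 5L * 1024 * 1024;      // the 5 MB .war file
        long encoded = base64Chars(warBytes);  // ~7 million Base64 chars
        long stringBytes = encoded * 2;        // Java chars are UTF-16 (2 bytes)
        System.out.println("Base64 chars: " + encoded);
        System.out.println("One String copy: " + stringBytes + " bytes");
        // A DOM-based SOAP stack may hold several such copies at once
        // (raw input buffer, DOM text node, decoded byte[]), plus per-node
        // object overhead -- which could be how 5 MB balloons toward 135 MB.
    }
}
```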

 

It is easy to say that I should just use attachments, but doing so compromises a really clean interface. I am now being forced to dissect my data and split it up to avoid being a memory hog, adding what seems to be unnecessary complexity. I want to understand why this is, and whether there are any parsers that could handle it.

 

Thanks for your time,

 

Greg Hess

Software Engineer

Wrapped Apps Corporation

275 Michael Cowpland Dr.

Suite 201

Ottawa, Ontario

K2M 2G2

Tel: (613) 591-7552

Fax: (613) 591-0523

1 (877) 388-6742

www.wrappedapps.com

 
