Hi all!

I have a requirement to transform a fairly large set of XML data that I
fetch over HTTP and then send out over FTP.
After hitting my first OOME I started reading up on how to use the
splitter with streaming and pair tokenizing (2.9 RC), which worked well
for splitting and transforming the individual pieces.
(Great work on this, btw!)

However, I have not found a good way to aggregate the data back together
after the transformation. The custom AggregationStrategy that I set on the
splitter simply concatenates the payloads into either a String or a byte[],
but this also gives me memory problems.

Now, streamCaching on the route coupled with a streaming splitter solves
the splitting part, but can anyone point me in the right direction for how
to efficiently piece the parts back together during the aggregation phase?
Basically, I want to offload the storage of the aggregated data from memory.

I have played around with the concept of simply appending all
split/transformed elements to a file, and FTPing that out after it has been
marked complete. This works, but then I lose all my state inside the route,
which I need for later reporting.

--
View this message in context: 
http://camel.465427.n5.nabble.com/Need-tips-for-splitter-aggregator-with-large-data-tp5020356p5020356.html
Sent from the Camel - Users mailing list archive at Nabble.com.