Hello dev,

at present I use Camel 2.2-fuse-02-00 (inside FUSE ESB 4.2-fuse-02-00) and
plan to upgrade to 2.3-fuse-01-00 once it's available.

I have a requirement to read big fixed-length files (up to 500 MB), split
the content into individual messages, process/transform these messages into
XML, aggregate all the transformed messages, and then write the result to a
new file.

My questions are about the aggregator:
- If I use the 'streaming()' option on the splitter, is the
'CamelSplitComplete' exchange property the best way to detect in the
aggregator that all messages have been processed? Does this mean that I
cannot process/transform the individual messages in parallel? My
understanding is also that I cannot use the 'completionFromBatchConsumer'
option on the aggregator, because it relies on the 'CamelBatchSize' header,
which is not set in streaming mode. Do you see other options?
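For reference, the kind of route I have in mind looks roughly like this (a
sketch only; the endpoint URIs, the bean name, and MyAggregationStrategy are
made up, and I have not verified this against the 2.3 aggregator DSL):

```java
from("file:inbox")
    .split(body().tokenize("\n")).streaming()
        .to("bean:recordToXml")   // transform one fixed-length record into XML
    .end()
    .aggregate(constant(true), new MyAggregationStrategy())
        // complete once the splitter flags the last split message
        .completionPredicate(property("CamelSplitComplete").isEqualTo(true))
    .to("file:outbox");
```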
- My understanding is that the aggregation is done in memory and is not
designed for aggregating big messages (500 MB or so). Would it be
useful/possible to extend the aggregator to stream the content into a file
(maybe with a header and a footer)? Or should this be part of a custom
aggregation strategy?
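Outside of Camel, the core of such a strategy could be a small file-backed
accumulator. A minimal plain-Java sketch of what I mean (the class and
method names are made up): the aggregate lives on disk, and only one
fragment is in memory at a time.

```java
import java.io.IOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;

/** Appends each transformed message to a file instead of holding all of them in memory. */
class FileBackedAggregator {
    private final Path target;
    private final Writer out;

    FileBackedAggregator(Path target, String header) throws IOException {
        this.target = target;
        this.out = Files.newBufferedWriter(target);
        out.write(header);      // e.g. the opening tag of the XML root element
    }

    void append(String xmlFragment) throws IOException {
        out.write(xmlFragment); // only this fragment is in memory, never the whole aggregate
    }

    Path complete(String footer) throws IOException {
        out.write(footer);      // e.g. the closing tag of the XML root element
        out.close();
        return target;          // the exchange would carry the file reference, not the content
    }
}

public class Demo {
    public static void main(String[] args) throws IOException {
        Path target = Files.createTempFile("aggregate", ".xml");
        FileBackedAggregator agg = new FileBackedAggregator(target, "<records>");
        agg.append("<record>1</record>");
        agg.append("<record>2</record>");
        agg.complete("</records>");
        System.out.println(Files.readString(target));
        // prints "<records><record>1</record><record>2</record></records>"
    }
}
```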
- Another solution could be to store all transformed messages in individual
files and, after all messages have been processed, have the aggregation
strategy kick off a custom processor which reads all the files and streams
them into a new (big) file.
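The merge step of that last variant could be done in constant memory with
plain Java streams; a sketch (class and file names are made up):

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

/** Streams a list of part files into one big target file without loading them into memory. */
public class FileMerger {
    static void merge(List<Path> parts, Path target) throws IOException {
        try (OutputStream out = Files.newOutputStream(target)) {
            for (Path part : parts) {
                try (InputStream in = Files.newInputStream(part)) {
                    in.transferTo(out);   // buffered copy, constant memory (Java 9+)
                }
            }
        }
    }

    public static void main(String[] args) throws IOException {
        Path a = Files.createTempFile("part", ".xml");
        Path b = Files.createTempFile("part", ".xml");
        Files.writeString(a, "<a/>");
        Files.writeString(b, "<b/>");
        Path target = Files.createTempFile("merged", ".xml");
        merge(List.of(a, b), target);
        System.out.println(Files.readString(target)); // prints "<a/><b/>"
    }
}
```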
- Do you see other solutions for this?

This might be a good topic for the wiki; I could write it up...

Regards,
Christian
-- 
View this message in context: 
http://camel.465427.n5.nabble.com/Splitting-big-files-and-aggregate-the-big-responses-tp858166p858166.html
Sent from the Camel Development mailing list archive at Nabble.com.
