Hi Christian,
to give you a better picture, my requirement goes like this.

I need to transfer a fixed-length record file to a destination. Meanwhile, my
route is responsible for transforming it into the required format (say CSV or
XML).

Now, the input file may be very big. It may contain many records (say about
500k). So if I use split(body().tokenize("\n"), new
CustomAggregationStrategy()).streaming(), it may cause delays and may also
lead to an out-of-memory error while aggregating the messages.

So, I thought of using split().method(CustomBean.class).streaming(), where
my CustomBean returns an Iterator (a custom iterator that iterates through
the input message stream and splits the incoming message based on line
numbers). In this case everything looks fine, but the destination file gets
overwritten with the latest split message instead of having every message
appended.
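For what it's worth, here is a minimal sketch of what I mean by the custom
iterator bean. This is plain Java, not Camel-specific; the class name, method
name, and the 1000-lines-per-chunk size are all just assumptions for
illustration:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.util.Iterator;

// Hypothetical sketch of the splitter bean: it groups N lines into one
// chunk so the splitter emits a bounded number of small messages instead
// of 500k single-line messages.
public class CustomBean {

    // Camel's split().method(CustomBean.class) would invoke a method like
    // this one; the chunk size of 1000 lines is an arbitrary assumption.
    public Iterator<String> splitByLines(Reader source) {
        final BufferedReader reader = new BufferedReader(source);
        final int linesPerChunk = 1000;

        return new Iterator<String>() {
            private String next = readChunk();

            // Reads up to linesPerChunk lines and joins them into one chunk;
            // returns null when the underlying stream is exhausted.
            private String readChunk() {
                try {
                    StringBuilder chunk = new StringBuilder();
                    String line;
                    int count = 0;
                    while (count < linesPerChunk
                            && (line = reader.readLine()) != null) {
                        chunk.append(line).append('\n');
                        count++;
                    }
                    return count == 0 ? null : chunk.toString();
                } catch (IOException e) {
                    throw new UncheckedIOException(e);
                }
            }

            @Override
            public boolean hasNext() {
                return next != null;
            }

            @Override
            public String next() {
                String current = next;
                next = readChunk();
                return current;
            }
        };
    }
}
```

Because only one chunk is in memory at a time, memory use stays bounded no
matter how large the input file is.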

Claus suggested using the "fileExist=Append" option. But as per my
requirement, after this split-and-transform process I need to perform some
more actions on the route. E.g.

RouteDefinition routeDef =
    from(src).split().method(CustomBean.class).streaming();
routeDef = routeDef.bean(new ActionBean1()); // could be a zipping action, etc.
routeDef = routeDef.bean(new ActionBean());
routeDef.to(dest);
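For reference, the append behaviour Claus mentioned is just an option on the
file endpoint URI used as dest; the directory and file name below are made up
for illustration:

```
file://target/out?fileName=merged.csv&fileExist=Append
```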

In this case, if I split the messages and don't aggregate them, then I am
afraid my action beans might not perform correctly (I am not certain about
this).

So, I am checking whether there is any way to aggregate the split messages
without using split(body().tokenize("\n"), new MyAggregationStrategy()),
because that will cause an out-of-memory error.
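To illustrate the kind of aggregation I have in mind (plain Java, outside
Camel; the class and method names are made up): instead of building the whole
aggregate in memory, each processed chunk is appended straight to the output
file and then discarded, which is essentially what fileExist=Append gives on
the file endpoint:

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.io.Writer;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardOpenOption;

// Sketch: "aggregate" split chunks by appending them to the target file
// one at a time, so memory use is bounded by one chunk, not the whole file.
public class AppendingCollector {

    public static void appendChunks(Path target, Iterable<String> chunks) {
        try (Writer out = Files.newBufferedWriter(target,
                StandardOpenOption.CREATE, StandardOpenOption.APPEND)) {
            for (String chunk : chunks) {
                out.write(chunk); // each chunk is written, then discarded
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```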


--
View this message in context: 
http://camel.465427.n5.nabble.com/Split-large-file-into-small-files-tp4678470p4697202.html
Sent from the Camel - Users mailing list archive at Nabble.com.