On Mon, Aug 8, 2011 at 7:29 PM, jeevan.koteshwara
<jeevan.koteshw...@gmail.com> wrote:
> I am trying to split a large fixed-length record file (say 350K records) into
> multiple files (100K records each). I thought of using
> from(src).split().method(MySplitBean.class).streaming().to(destination). But
> this may give memory problems while processing large files (say 500K
> records). Since MySplitBean would have to return a List object (which may
> contain a very large amount of data), I doubt this is a good approach.
>
> Is there any other methods available to split the input file?
>

You could in fact just use a regular Java bean to do all the file
splitting manually.
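
For example, something like this plain Java sketch (untested; the
FileChunkSplitter name, the output file naming and the chunk size are
just placeholders of my own):

import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.FileReader;
import java.io.FileWriter;
import java.io.IOException;

public class FileChunkSplitter {

    /**
     * Reads the source file line by line and rolls over to a new
     * output file every linesPerFile lines, so only one line is
     * held in memory at a time.
     */
    public static void split(String source, String targetPrefix,
                             int linesPerFile) throws IOException {
        BufferedReader reader = new BufferedReader(new FileReader(source));
        try {
            BufferedWriter writer = null;
            String line;
            int count = 0;
            int fileIndex = 0;
            while ((line = reader.readLine()) != null) {
                // start a new output file every linesPerFile lines
                if (count % linesPerFile == 0) {
                    if (writer != null) {
                        writer.close();
                    }
                    writer = new BufferedWriter(new FileWriter(
                            targetPrefix + "-" + fileIndex++ + ".txt"));
                }
                writer.write(line);
                writer.newLine();
                count++;
            }
            if (writer != null) {
                writer.close();
            }
        } finally {
            reader.close();
        }
    }
}

You can invoke such a bean from a route, or just call it directly,
e.g. FileChunkSplitter.split("big.txt", "out/part", 100000);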

Alternatively, if you want to use the Camel splitter, you can return an
Iterator that iterates over a custom InputStream, by which you read the
source file in chunks, e.g. until you have read 50K lines (or reached
the end of the source file).

Then it would all be streaming-based, and you would not read the entire
file into memory.

But you would then have to fiddle a bit with low-level code: a custom
Iterator and a custom InputStream.
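
A sketch of what such a splitter bean could look like (untested; the
MySplitBean name, the splitBody method and the 50K chunk size are just
placeholders, and this one wraps the stream in a Reader rather than
writing a custom InputStream):

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.util.Iterator;

public class MySplitBean {

    // Camel passes in the message body (the file as a stream) and the
    // returned Iterator is consumed lazily by split().streaming()
    public Iterator<String> splitBody(InputStream body) {
        final BufferedReader reader =
                new BufferedReader(new InputStreamReader(body));
        return new Iterator<String>() {
            private String nextChunk = readChunk();

            // read up to 50K lines into one chunk; return null at EOF
            private String readChunk() {
                StringBuilder sb = new StringBuilder();
                try {
                    String line;
                    int lines = 0;
                    while (lines < 50000
                            && (line = reader.readLine()) != null) {
                        sb.append(line).append('\n');
                        lines++;
                    }
                    if (sb.length() == 0) {
                        reader.close(); // end of file, release the stream
                        return null;
                    }
                } catch (IOException e) {
                    throw new RuntimeException(e);
                }
                return sb.toString();
            }

            public boolean hasNext() {
                return nextChunk != null;
            }

            public String next() {
                String chunk = nextChunk;
                nextChunk = readChunk();
                return chunk;
            }

            public void remove() {
                throw new UnsupportedOperationException();
            }
        };
    }
}

Then the route from your question should work mostly unchanged, e.g.
from(src).split().method(MySplitBean.class, "splitBody").streaming().to(destination)
and each chunk becomes its own message, with only one chunk held in
memory at a time.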



-- 
Claus Ibsen
-----------------
FuseSource
Email: cib...@fusesource.com
Web: http://fusesource.com
Twitter: davsclaus, fusenews
Blog: http://davsclaus.blogspot.com/
Author of Camel in Action: http://www.manning.com/ibsen/
