I have added it to manage a pool of streams, and it works fine:

public class AEDFileWriter {

    private Map<String, FileOutputStream> fosPool = new HashMap<>();

    public void writeLine(@Body String data, @Property("OutputFileName") String outputFileName, @Header("CamelSplitComplete") boolean done) throws IOException {
        // Get the output stream, creating it if it does not exist yet
        FileOutputStream fos = fosPool.get(outputFileName);
        if (fos == null) {
            fos = new FileOutputStream(new File(outputFileName));
            fosPool.put(outputFileName, fos);
        }
        // Append the data
        fos.write(data.getBytes());
        // Close and remove the stream when the split is complete
        if (done) {
            fos.close();
            fosPool.remove(outputFileName);
        }
    }
}

My POC is finished: Camel can process (on my laptop) 1200 lines of CSV per second, and this number grows with threading. I will put the source of this project on GitHub; I think it could be helpful for a lot of Camel beginners who need to deal with gigabytes of files!

Thanks

JF

2013/3/8 Jean Francois LE BESCONT <jflebesc...@gmail.com>

> It works fine with the simple example presented, but in a seda queue
> executed with concurrentConsumers the bean throws an exception due to
> concurrency ...
>
>
> 2013/3/8 Jean Francois LE BESCONT <jflebesc...@gmail.com>
>
>> OK !
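Since the thread mentions a concurrency exception with seda concurrentConsumers, here is a minimal sketch of a thread-safe variant of the same pooling idea (assuming Java 8+; the class and method names are hypothetical, not the `AEDFileWriter` above). `ConcurrentHashMap.computeIfAbsent` creates each stream exactly once, and per-stream synchronization serializes writes to the same file:

```java
import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.concurrent.ConcurrentHashMap;

// Hypothetical thread-safe variant of the pooled writer (sketch, not the
// author's class). Plain HashMap is not safe under concurrent consumers;
// ConcurrentHashMap plus per-stream locking avoids lost updates and
// interleaved writes.
class PooledLineWriter {

    private final ConcurrentHashMap<String, OutputStream> pool = new ConcurrentHashMap<>();

    public void writeLine(String data, String fileName, boolean done) throws IOException {
        // computeIfAbsent is atomic on ConcurrentHashMap: only one thread
        // creates the stream for a given file name.
        OutputStream out = pool.computeIfAbsent(fileName, f -> {
            try {
                return new FileOutputStream(new File(f));
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        });
        // Serialize writes (and the final close) on the stream itself.
        synchronized (out) {
            out.write(data.getBytes(StandardCharsets.UTF_8));
            if (done) {
                out.close();
                pool.remove(fileName);
            }
        }
    }
}
```

The same bean-parameter bindings as above (`@Body`, `@Property`, `@Header`) could be added when wiring it into the route; they are left out here to keep the sketch plain JDK.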
>>
>> I have created a JIRA: https://issues.apache.org/jira/browse/CAMEL-6147 (my
>> first :)
>>
>>
>> By the way, the solution is a route like this:
>>
>> from("file://C:/Temp/camel/input_test/?noop=true")
>>     .setProperty("OutputFileName",
>>         simple("C:/Temp/camel/output_test/${headers.CamelFileName}"))
>>     .split()
>>         .tokenize("\n")
>>         .streaming()
>>         .bean(MyFileWriter.class, "writeLine")
>>     .end()
>> .end();
>>
>> With a MyFileWriter like this:
>>
>> public class AEDFileWriter {
>>
>>     private FileOutputStream fos;
>>
>>     public void writeLine(String data, @Property("OutputFileName") String outputFileName, @Header("CamelSplitComplete") boolean done) throws IOException {
>>
>>         if (fos == null) {
>>             fos = new FileOutputStream(new File(outputFileName));
>>         }
>>         fos.write(data.getBytes());
>>         if (done) {
>>             fos.close();
>>             fos = null;
>>         }
>>     }
>> }
>>
>>
>> I am close to the end :)
>>
>>
>> 2013/3/8 Claus Ibsen <claus.ib...@gmail.com>
>>
>>> On Fri, Mar 8, 2013 at 2:39 PM, jeff <jflebesc...@gmail.com> wrote:
>>> > The more performant way looks to be:
>>> >
>>> > from("file://C:/Temp/camel/input_test/?noop=true")
>>> >     .split()
>>> >         .tokenize("\n")
>>> >         // Business logic with possible reject / enrich etc ...
>>> >         .streaming()
>>> >         .to("stream:file?fileName=C:/Temp/camel/output_test/out.csv")
>>> >     .end()
>>> > .end();
>>> >
>>> > But the stream is not closed ... Please help, I have to finish my proof of
>>> > concept ! :)
>>> >
>>>
>>> We should improve stream:file to be able to auto close when it
>>> detects the splitter is done, as we have a completed property on the
>>> exchange to tell us.
>>>
>>> Feel free to log a JIRA ticket, then we can add an option on stream:file:
>>>
>>> "stream:file?fileName=C:/Temp/camel/output_test/out.csv&closeOnDone=true"
>>>
>>> Just have to figure out a good name for the option.
>>>
>>>
>>> > Thanks
>>> >
>>> > JF
>>>
>>>
>>> --
>>> Claus Ibsen
>>> -----------------
>>> Red Hat, Inc.
>>> FuseSource is now part of Red Hat
>>> Email: cib...@redhat.com
>>> Web: http://fusesource.com
>>> Twitter: davsclaus
>>> Blog: http://davsclaus.com
>>> Author of Camel in Action: http://www.manning.com/ibsen
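For reference, if the `closeOnDone` option suggested above is added (tracked in CAMEL-6147), the original stream-based route could avoid the custom writer bean entirely. A sketch, untested and assuming the option keeps the name Claus proposed:

```java
from("file://C:/Temp/camel/input_test/?noop=true")
    .split()
        .tokenize("\n")
        .streaming()
        // closeOnDone=true is the proposed option: close the stream when
        // the CamelSplitComplete property signals the splitter is finished.
        .to("stream:file?fileName=C:/Temp/camel/output_test/out.csv&closeOnDone=true")
    .end();
```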