Hi. I am using Bindy to marshal a large set of data (in batches) to a CSV file.
The problem I am seeing is that Bindy does not seem to offer any streaming solution: it expects a single message (with the whole CSV content) to be loaded in memory before it starts writing to the file. I cannot write in batches (my aggregator predicate expects the data set to be complete), because otherwise Bindy treats each batch as a separate message and outputs a batch of files. Is there any way to do what I am trying to do without loading everything into memory? I am thinking of something along the lines of a "tokenizer().streaming()" solution.

While searching this forum, I found this post: http://camel.465427.n5.nabble.com/Bindy-Streaming-Multile-Rows-td5744413.html I think it touches on the same thing I am asking here, but since it is a post from 2013, I wonder if someone has found a workaround since?

Thanks in advance!

IAM

--
View this message in context: http://camel.465427.n5.nabble.com/Bindy-marshalling-large-CSV-tp5766124.html
Sent from the Camel - Users mailing list archive at Nabble.com.
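A minimal sketch of the kind of workaround I have in mind, in case it helps frame the question: marshal each batch on its own and let the file endpoint append every result to a single file via its `fileExist=Append` option, so only one batch is ever in memory at a time. The `Order` class, the `direct:batch` endpoint, and the output file name below are hypothetical placeholders, not anything from my actual route.

```java
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.dataformat.bindy.annotation.CsvRecord;
import org.apache.camel.dataformat.bindy.annotation.DataField;
import org.apache.camel.model.dataformat.BindyType;

public class BatchCsvRoute extends RouteBuilder {

    // Hypothetical Bindy-annotated record; stands in for the real model class.
    @CsvRecord(separator = ",")
    public static class Order {
        @DataField(pos = 1)
        private String id;

        @DataField(pos = 2)
        private String product;
    }

    @Override
    public void configure() {
        // Each incoming exchange carries one batch (e.g. a List<Order>),
        // not the whole data set.
        from("direct:batch")
            .marshal().bindy(BindyType.Csv, Order.class)
            // fileExist=Append makes every marshalled batch append to the
            // same file instead of producing one file per message.
            .to("file:target/out?fileName=orders.csv&fileExist=Append");
    }
}
```

One caveat with this approach: if the CSV format generates a header row, it would be repeated at the start of every appended batch, so the header would need to be suppressed (or emitted once separately) for this to produce a single well-formed file.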