I have a very large flat file, say 50-100 GB (e.g. daily transactions).

I'm looking at the possibility of using Camel to process the flat file and
update a database; this seems to be where the file and stream components
come into play.

My gut reaction is to have a route that simply reads 500 MB to 1 GB worth of
data at a time and writes it to a file in a folder, breaking the input down
into manageable chunks.
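To make the chunking step concrete, here is a minimal plain-Java sketch of what that first route would do under the hood: stream the big file line by line and roll over to a new chunk file every N lines. The class name, file names, and the line-count threshold are all illustrative choices, not anything Camel-specific.

```java
import java.io.BufferedReader;
import java.io.BufferedWriter;
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class ChunkWriter {
    // Split a large text file into chunk files of at most maxLines lines each,
    // streaming line by line so the whole file never sits in memory.
    // Returns the number of chunk files written.
    static int splitIntoChunks(Path input, Path outDir, int maxLines) throws IOException {
        Files.createDirectories(outDir);
        int chunkIndex = 0;
        try (BufferedReader reader = Files.newBufferedReader(input)) {
            String line = reader.readLine();
            while (line != null) {
                Path chunk = outDir.resolve("chunk-" + chunkIndex + ".txt");
                try (BufferedWriter writer = Files.newBufferedWriter(chunk)) {
                    int written = 0;
                    while (line != null && written < maxLines) {
                        writer.write(line);
                        writer.newLine();
                        written++;
                        line = reader.readLine();
                    }
                }
                chunkIndex++;
            }
        }
        return chunkIndex;
    }

    public static void main(String[] args) throws IOException {
        // Tiny demo: 10 lines split into chunks of 4 lines -> 3 chunk files.
        Path input = Files.createTempFile("transactions", ".txt");
        StringBuilder sb = new StringBuilder();
        for (int i = 0; i < 10; i++) {
            sb.append("txn-").append(i).append('\n');
        }
        Files.writeString(input, sb.toString());
        Path outDir = Files.createTempDirectory("chunks");
        System.out.println(splitIntoChunks(input, outDir, 4));
    }
}
```

For the real 50-100 GB case you would size the chunks by bytes or line count so each comes out around the 500 MB to 1 GB target.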

Another route then watches that folder, picks up a file, and hands it over to
a thread-pool-backed route for processing.
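The two routes above might be sketched in Camel's Java DSL roughly as follows. This is only a sketch under assumptions: the endpoint directories, the bean name `transactionUpdater`, the chunk size, and the chunk file-name expression are all placeholders I've made up, and exact option names can vary between Camel versions.

```java
import org.apache.camel.builder.RouteBuilder;

public class BigFileRoutes extends RouteBuilder {
    @Override
    public void configure() {
        // Route 1: stream the big file, splitting it into chunks of
        // 100k lines; streaming() keeps only the current chunk in memory.
        // The fileName expression gives each chunk a unique name.
        from("file:/data/in?noop=true")
            .split().tokenize("\n", 100000).streaming()
            .to("file:/data/chunks?fileName="
                + "${file:name.noext}-${exchangeProperty.CamelSplitIndex}.txt");

        // Route 2: pick up each chunk file and process its lines
        // on a thread pool, handing each record to a (hypothetical)
        // bean that updates the database.
        from("file:/data/chunks?delete=true")
            .threads(10)
            .split().tokenize("\n").streaming()
            .to("bean:transactionUpdater");
    }
}
```

The key point in both routes is `streaming()` on the splitter, which avoids loading the whole body before splitting.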

Has anybody attempted this kind of scenario?
If so, what challenges did you run into?

Are there any limitations with the stream or file component? And which
component is better suited to this task?

Regards,

Rajith Muditha Attapattu <http://rajith.2rlabs.com/>
