Hi everybody,
 
In my ExecuteRuleEngine processor I have used a LineIterator to loop over 
incoming rows of data (CSV).
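
For reference, here is roughly what my onTrigger does, stripped down 
(applyRules and REL_SUCCESS are placeholders for my actual code):

  import java.io.BufferedReader;
  import java.io.InputStreamReader;
  import java.nio.charset.StandardCharsets;

  import org.apache.commons.io.LineIterator;
  import org.apache.nifi.flowfile.FlowFile;
  import org.apache.nifi.processor.ProcessContext;
  import org.apache.nifi.processor.ProcessSession;

  @Override
  public void onTrigger(ProcessContext context, ProcessSession session) {
      FlowFile flowFile = session.get();
      if (flowFile == null) {
          return;
      }
      session.read(flowFile, in -> {
          // stream the content row by row instead of loading it all at once
          LineIterator lines = new LineIterator(
              new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8)));
          try {
              while (lines.hasNext()) {
                  String row = lines.nextLine();
                  applyRules(row); // placeholder for the rule engine call
              }
          } finally {
              LineIterator.closeQuietly(lines);
          }
      });
      session.transfer(flowFile, REL_SUCCESS); // placeholder relationship
  }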
 
I was wondering if that is the preferred method for (large) files or if there 
are better ways to do it. Also, is there a way to influence the number of 
buffered rows or bytes that the LineIterator uses?
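
My assumption so far (please correct me if this is wrong) is that 
LineIterator does not buffer rows itself and the read-ahead is determined 
by the underlying BufferedReader, so sizing that buffer explicitly would 
be the only knob:

  int bufferSize = 64 * 1024; // bytes; value picked arbitrarily
  LineIterator lines = new LineIterator(
      new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8), bufferSize));

Is that correct, or is there a better way?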
 
I noticed that when I feed a large file (about 2 million rows) to my 
processor, the flow stops and aborts at a certain point, probably a memory 
issue; I do not have a lot of RAM available. On the other hand, I can 
process the same large file if I put a SplitText processor in front of mine 
and split the file into chunks of e.g. 200,000 rows.
 
Any hints or recommendations are welcome.
 
Greetings,
 
Uwe
