Hello,

We have a large file that we would like to read row by row, unmarshal into
Bindy objects, and insert into a database inside a single transaction,
without blowing up the Java heap.
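
Roughly, the route we have in mind looks like this (a minimal sketch, inside
a RouteBuilder's configure(); the Order model class, the file endpoint, the
table and the transaction setup are assumptions, not our actual code):

    // Assumes a transaction manager / transacted policy is registered, and
    // that Order is our Bindy-annotated model class.
    BindyCsvDataFormat bindy = new BindyCsvDataFormat(Order.class);

    from("file:inbox")
        .transacted()                              // one transaction per file
        .split(body().tokenize("\n")).streaming()  // one line at a time, flat heap
            .unmarshal(bindy)                      // CSV line -> Order
            // (depending on the Camel version, unmarshal may return the
            // object itself or a one-element list)
            .to("sql:insert into orders (id, amount)"
                + " values (:#${body.id}, :#${body.amount})")
        .end();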

Several attempts have been made using a nested Split EIP together with the
SQL producer's batch support. In one of them we used a combination of the
Split and Loop EIPs with temp files.
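
The batch-oriented attempt was shaped roughly like this (again only a
sketch: the chunk size, endpoints and SQL are examples, it assumes a Camel
version where tokenize can group lines, and the exact parameter binding of
the SQL producer in batch mode is glossed over):

    // Stream the file in chunks of 500 lines, unmarshal each chunk to a
    // List<Order>, and hand the whole list to the SQL producer as one batch.
    from("file:inbox")
        .transacted()
        .split(body().tokenize("\n", 500, false)).streaming()
            .unmarshal(bindy)          // chunk -> List<Order>
            .to("sql:insert into orders (id, amount)"
                + " values (:#${body.id}, :#${body.amount})?batch=true")
        .end();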

In the process we discovered that the Loop EIP doesn't propagate the
"Transacted" state, so the SQL producer updates the database with
auto-commit even if the route is transacted.
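
Reduced to its smallest shape, the behaviour we are seeing looks like this
(sketch; the endpoints and SQL are placeholders):

    // The route is transacted, but the SQL producer inside loop() runs with
    // auto-commit, so the exception afterwards does not roll anything back.
    from("direct:start")
        .transacted()
        .loop(10)                      // "Transacted" state seems lost in here
            .to("sql:insert into orders (id, amount) values (1, 100)")
        .end()
        .throwException(new IllegalStateException("boom")); // rows stay committed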

The first question: is this intended, and if so, is it documented?

The second question: how should our business case be solved without a
global transaction?

Attached you can find a JUnit test with three test methods:

   - *test00_PlainJdbcInsert*, which shows the intended solution with plain
   JDBC using PreparedStatement batches (see the sketch after this list).
   - *test01_HappyPath*, which successfully inserts all rows from all
   batches into the database.
   - *test02_OnException*, which throws an exception after a number of
   batches; we expected no records in the database afterwards.
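
For comparison, test00_PlainJdbcInsert is essentially the following
(simplified; Order, its getters, the table and the batch size of 500 are
examples, and the usual java.sql imports are omitted):

    // One connection, one transaction, many PreparedStatement batches:
    // either every row is committed, or none is.
    try (Connection con = dataSource.getConnection()) {
        con.setAutoCommit(false);
        try (PreparedStatement ps = con.prepareStatement(
                "insert into orders (id, amount) values (?, ?)")) {
            int n = 0;
            for (Order o : orders) {      // in the test: rows read from the file
                ps.setLong(1, o.getId());
                ps.setBigDecimal(2, o.getAmount());
                ps.addBatch();
                if (++n % 500 == 0) {
                    ps.executeBatch();    // send a batch, but do not commit yet
                }
            }
            ps.executeBatch();            // flush the remainder
            con.commit();                 // single commit for the whole file
        } catch (Exception e) {
            con.rollback();               // nothing persisted on failure
            throw e;
        }
    }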


Mvh / Best regards,

*Håkan Lantz*



+46 (0)736 840 870
hakan.la...@replyto.se
www.replyto.se
