That code example doesn't look wrong to me, except that I would never just
catch RuntimeException and then do nothing about it. Actually, I wouldn't
try-catch anything here... let it crash if something is truly wrong. You
may be hiding an important error message.
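To make the point concrete, here is a toy illustration (insertRow and the
null check are just stand-ins, not MetaModel code): with an empty catch,
the loop "succeeds" silently even though a row was never inserted, and the
actual error message is lost.

```java
public class FailFast {
    // Hypothetical stand-in for insertBuilder.execute(): fails on bad input.
    static void insertRow(Object value) {
        if (value == null) {
            throw new IllegalArgumentException("null value for NOT NULL column");
        }
    }

    public static void main(String[] args) {
        Object[] rows = { "a", null, "c" };
        int inserted = 0;
        for (Object row : rows) {
            try {
                insertRow(row);
                inserted++;
            } catch (RuntimeException e) {
                // swallowed: the failure is invisible to the caller
            }
        }
        // Only 2 of the 3 rows made it, and nothing tells you why.
        System.out.println(inserted);
    }
}
```

Dropping the try-catch entirely would have stopped the run at the bad row
with a stack trace pointing at the real problem.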

You could try using different buffer sizes... In another open source
project I work on that uses this (DataCleaner), we allow selecting between
something like 1k, 10k, 20k and 100k buffers. I can't recall if that
buffer number is an actual record count - I think it may be a "value
count", i.e. columns x rows.

On Wed, 17 Oct 2018 at 21:21, Laxmi Lal Menaria <
menarialaxmi...@gmail.com> wrote:

> Hello Everyone,
>
> I have created a sample which inserts data from a CSV into a SQL table. I
> used RowInsertionBuilder with BatchUpdateScript; it works fine but takes
> too much time to complete the operation because we have millions of rows
> in the CSV.
>
> I need a better way to speed up the process. Please let me know how I can
> improve it, e.g. close the PreparedStatement after 10k rows or something
> else, so a few thousand rows are executed at a time and the list is freed
> up.
>
> Current code block is:
>
> final UpdateableDataContext dc = con.getUpdateableDataContext();
> dc.executeUpdate((BatchUpdateScript) callback -> {
>     for (final Object[] rowData : buffer) {
>         RowInsertionBuilder insertBuilder =
>                 callback.insertInto(columns[0].getTable());
>         for (int i = 0; i < columns.length; i++) {
>             insertBuilder = insertBuilder.value(columns[i], rowData[i]);
>         }
>         try {
>             insertBuilder.execute();
>         } catch (final RuntimeException e) {
>         }
>     }
> });
>
> --
>
> Thanks,
> Laxmilal Menaria | +91 982 955 3793 | http://laxmilalmenaria.com/
>
