Right, so a buffer size of 300k seems like it's probably just too big. You're
trying to hold all of that in memory, which may be too much depending on the
number of fields/columns in your data set. Now that you've eliminated the
empty catch block, I'd try going back down to a few thousand records at a
time.
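
To make that concrete, here is a rough sketch of what I mean (reusing the same
MetaModel calls as in your snippet further down). The FLUSH_SIZE constant, the
insertInChunks/flush helper names and the streaming "rows" iterable are just
illustrative - they are not part of MetaModel - and the import paths are from
memory, so please double-check them:

import java.util.ArrayList;
import java.util.List;

import org.apache.metamodel.BatchUpdateScript;
import org.apache.metamodel.UpdateableDataContext;
import org.apache.metamodel.insert.RowInsertionBuilder;
import org.apache.metamodel.schema.Column;

public class ChunkedInsert {

    // A few thousand rows per executeUpdate; tune this to your column count.
    private static final int FLUSH_SIZE = 5_000;

    // 'dc' and 'columns' are the same objects as in your code; 'rows' should
    // be a lazy/streaming iterable over the CSV so the whole file never sits
    // in memory at once.
    static void insertInChunks(UpdateableDataContext dc, Column[] columns,
            Iterable<Object[]> rows) {
        final List<Object[]> buffer = new ArrayList<>(FLUSH_SIZE);
        for (final Object[] rowData : rows) {
            buffer.add(rowData);
            if (buffer.size() >= FLUSH_SIZE) {
                flush(dc, columns, buffer);
            }
        }
        if (!buffer.isEmpty()) {
            flush(dc, columns, buffer); // write the remaining tail
        }
    }

    private static void flush(UpdateableDataContext dc, Column[] columns,
            List<Object[]> buffer) {
        dc.executeUpdate((BatchUpdateScript) callback -> {
            for (final Object[] rowData : buffer) {
                RowInsertionBuilder insert = callback.insertInto(columns[0].getTable());
                for (int i = 0; i < columns.length; i++) {
                    insert = insert.value(columns[i], rowData[i]);
                }
                insert.execute(); // no empty catch - let a real failure surface
            }
        });
        buffer.clear(); // release the rows that were just written
    }
}

The important part is that each executeUpdate only ever sees FLUSH_SIZE rows
and the buffer is cleared afterwards, so memory stays flat no matter how many
rows the CSV contains.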

On Tue, 23 Oct 2018 at 21:38, Laxmi Lal Menaria <
menarialaxmi...@gmail.com> wrote:

> OK. The current code is throwing an OutOfMemoryError after 300K rows.
> Stack trace:
>
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>    at java.util.Arrays.copyOf(Arrays.java:3332)
>    at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
>    at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:649)
>    at java.lang.StringBuilder.append(StringBuilder.java:202)
>    at org.apache.metamodel.jdbc.JdbcInsertBuilder.createSqlStatement(JdbcInsertBuilder.java:117)
>    at org.apache.metamodel.jdbc.JdbcInsertBuilder.createSqlStatement(JdbcInsertBuilder.java:96)
>    at org.apache.metamodel.jdbc.JdbcInsertBuilder.execute(JdbcInsertBuilder.java:63)
>
>
> I have already used try-catch in that block where required. Please have a
> look at it once and let me know if there is any option to close the
> connection and the PreparedStatement after 100k rows while keeping a greater
> buffer size. That would help with the memory issue and would work much
> better for large data blocks.
>
> Thanks,
> Laxmi Lal Menaria
>
>
> On Thu, Oct 18, 2018 at 11:29 PM Kasper Sørensen <
> i.am.kasper.soren...@gmail.com> wrote:
>
> > That code example doesn't look wrong to me, except I would never just
> > catch RuntimeException and then do nothing about it. Actually I wouldn't
> > want to try-catch anything here... Let it crash if something is truly
> > wrong. You may be hiding an important error message?
> >
> > You could try using different buffer sizes... In another open source
> > project that I work on that uses this (DataCleaner) we allow selecting
> > between something like 1k, 10k, 20k and 100k size buffers. I can't recall
> > if that buffer number is an actual record count - I think it may be a
> > "value count", i.e. columns x rows.
> >
> > On Wed, 17 Oct 2018 at 21:21, Laxmi Lal Menaria <
> > menarialaxmi...@gmail.com> wrote:
> >
> > > Hello Everyone,
> > >
> > > I have created a sample which inserts data from a CSV file into a SQL
> > > table. I used RowInsertionBuilder with BatchUpdateScript, and it works
> > > fine but takes too much time to complete the operation because we have
> > > millions of rows in the CSV.
> > >
> > > I need a better way to speed up the process. Please let me know how I
> > > can improve it, e.g. by closing the PreparedStatement after 10k rows or
> > > something else, so that only a few thousand rows are executed at a time
> > > and the list is freed up.
> > >
> > > Current code block is:
> > >
> > > final UpdateableDataContext dc = con.getUpdateableDataContext();
> > > dc.executeUpdate((BatchUpdateScript) callback -> {
> > >     for (final Object[] rowData : buffer) {
> > >         RowInsertionBuilder insertBuilder =
> > >                 callback.insertInto(columns[0].getTable());
> > >         for (int i = 0; i < columns.length; i++) {
> > >             insertBuilder = insertBuilder.value(columns[i], rowData[i]);
> > >         }
> > >         try {
> > >             insertBuilder.execute();
> > >         } catch (final RuntimeException e) {
> > >             // empty catch block - any insert failure is silently hidden
> > >         }
> > >     }
> > > });
> > >
> > > --
> > >
> > > Thanks,
> > > Laxmilal Menaria | +91 982 955 3793 | http://laxmilalmenaria.com/
> > >
> >
>
>
> --
>
> Thanks,
> Laxmilal Menaria | +91 982 955 3793 | http://laxmilalmenaria.com/
>
