Right, so a buffer size of 300k seems like it's probably just too big.
You're trying to hold all of that in memory, which may be too much depending
on the number of fields/columns in your data set. Now that you've
eliminated the empty catch block, I'd try going back down to a few thousand
records at a time.
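A minimal sketch of the chunking pattern being suggested (plain Java, no MetaModel dependency; `flush` is a hypothetical stand-in for whatever executes one batch, e.g. a BatchUpdateScript or a JDBC batch), so only a few thousand rows are ever held in memory at once:

```java
import java.util.ArrayList;
import java.util.List;

public class ChunkedInsert {
    static final int BATCH_SIZE = 5000; // a few thousand rows per flush, not 300k

    // Stand-in for executing one batch against the database
    // (e.g. a BatchUpdateScript, or PreparedStatement.executeBatch with JDBC).
    static void flush(List<String[]> batch) {
        // insert the batch here; afterwards the list can be garbage-collected
    }

    public static void main(String[] args) {
        List<String[]> batch = new ArrayList<>(BATCH_SIZE);
        int flushes = 0;
        // Simulated CSV rows; in real code, stream lines with a BufferedReader
        // instead of loading the whole file.
        for (int row = 0; row < 12_000; row++) {
            batch.add(new String[] {"col1-" + row, "col2-" + row});
            if (batch.size() == BATCH_SIZE) {
                flush(batch);
                flushes++;
                batch = new ArrayList<>(BATCH_SIZE); // drop references so GC can reclaim
            }
        }
        if (!batch.isEmpty()) { // final partial batch
            flush(batch);
            flushes++;
        }
        System.out.println("flushes=" + flushes);
    }
}
```

The key point is that the batch list is replaced after each flush, so memory stays bounded regardless of how many millions of rows the CSV contains.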
OK, the current code is throwing an OutOfMemoryError after 300K rows.
Stack trace:
java.lang.OutOfMemoryError: GC overhead limit exceeded
at java.util.Arrays.copyOf(Arrays.java:3332)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
at java.lang.
That code example doesn't look wrong to me, except that I would never just
catch RuntimeException and then do nothing about it. Actually, I wouldn't
try-catch anything here... let it crash if something is truly wrong.
Otherwise you may be hiding an important error message.
You could try using different buffer sizes.
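To illustrate the point about the empty catch block (hypothetical `risky` method, not the original code): swallowing a RuntimeException makes the failure invisible, while letting it surface tells you exactly what went wrong.

```java
public class SwallowDemo {
    // Stand-in for any operation that can fail at runtime
    static void risky() {
        throw new IllegalStateException("disk full");
    }

    public static void main(String[] args) {
        // Anti-pattern: the empty catch hides the real failure entirely
        try {
            risky();
        } catch (RuntimeException e) {
            // swallowed: nothing logged, nothing rethrown
        }
        System.out.println("after empty catch: no clue anything failed");

        // Better: don't catch at all so the stack trace surfaces.
        // Here we catch only to print what the hidden message would have been.
        try {
            risky();
        } catch (RuntimeException e) {
            System.out.println("hidden cause was: " + e.getMessage());
        }
    }
}
```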
Hello Everyone,
I have created a sample that inserts data from a CSV file into a SQL
table. I used RowInsertBuilder with BatchUpdateScript; it works fine, but
it takes too much time to complete because we have millions of rows in the
CSV.
I need a better way to speed up the process, please let me know.