How come it's getting OOM on import? It shouldn't, should it? I've imported several GB of data before; I did get OOM, but increasing the heap size to around 1 GB worked for me. I didn't need to go to crazy sizes...

On 10/09/2013 4:32 PM, Noel Grandin wrote:

On 2013-09-06 20:15, Brian Craft wrote:
I need to load about 1 GB of data into an existing db, while maintaining data coherence. Wrapping the inserts in one transaction results in out-of-memory problems in the JVM. I increased the max heap size to 8 GB without improvement. I can split it into a bunch of smaller commits, which works fine, but then on error I need a bunch of application code to delete the transactions that succeeded. The deletes will need their own transactions, which could also fail.

Is there any better way to do this?
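For illustration, here is a minimal JDBC sketch of the "bunch of smaller commits" approach described above. The table IMPORT_DATA, the row iterator, and the JDBC URL are hypothetical, and the H2 driver is assumed to be on the classpath:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.util.Iterator;

public class ChunkedImport {
    private static final int BATCH_SIZE = 10_000; // commit every 10,000 rows

    public static void load(Iterator<String[]> rows) throws Exception {
        try (Connection conn = DriverManager.getConnection("jdbc:h2:./data/testdb")) {
            conn.setAutoCommit(false);
            try (PreparedStatement ps = conn.prepareStatement(
                    "INSERT INTO IMPORT_DATA (ID, PAYLOAD) VALUES (?, ?)")) {
                int n = 0;
                while (rows.hasNext()) {
                    String[] row = rows.next();
                    ps.setString(1, row[0]);
                    ps.setString(2, row[1]);
                    ps.addBatch();
                    if (++n % BATCH_SIZE == 0) {
                        ps.executeBatch();
                        conn.commit(); // bounded memory use, but earlier chunks stay committed on failure
                    }
                }
                ps.executeBatch();
                conn.commit();
            }
        }
    }
}

Committing every N rows keeps memory use bounded, but as noted, a failure partway through leaves the earlier chunks committed and needing cleanup.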


Not really.
One strategy would be to copy the DB, since with H2 it's just a single file, and then run your import process.
If it fails, just replace the modified DB with the backup.
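Roughly, that strategy could look like the sketch below. The file names and the runImport() helper are hypothetical, and the exact database file extension depends on the H2 version; all connections must be closed before copying the file:

import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public class CopyThenImport {
    public static void main(String[] args) throws Exception {
        Path db = Paths.get("data/testdb.h2.db");     // hypothetical path; extension depends on H2 version
        Path backup = Paths.get("data/testdb.h2.db.bak");

        // All connections must be closed before copying the database file.
        Files.copy(db, backup, StandardCopyOption.REPLACE_EXISTING);
        try {
            runImport();                              // hypothetical import routine
            Files.deleteIfExists(backup);             // success: discard the backup
        } catch (Exception e) {
            // failure: put the original file back, then rethrow
            Files.copy(backup, db, StandardCopyOption.REPLACE_EXISTING);
            throw e;
        }
    }

    private static void runImport() throws Exception {
        // e.g. chunked inserts as in the earlier sketch
    }
}

Since the import either completes or the original file comes back, no per-chunk cleanup code is needed.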


