Hi,

I'm using a BatchInserter and a LuceneIndexBatchInserter to insert >5m nodes
and >5m relationships into a graph in one go. The insertion itself seems to
work, but shutting down takes forever - it's been running for 2 hours now.
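For reference, here's a simplified sketch of what my insertion code does (class
and store names are illustrative, from the batch-insert API docs; my actual code
differs in the details):

```java
import java.util.HashMap;
import java.util.Map;

import org.neo4j.graphdb.DynamicRelationshipType;
import org.neo4j.kernel.impl.batchinsert.BatchInserter;
import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;
import org.neo4j.index.lucene.LuceneIndexBatchInserter;
import org.neo4j.index.lucene.LuceneIndexBatchInserterImpl;

public class BulkLoad {
    public static void main(String[] args) {
        // Open the store directory for non-transactional batch insertion.
        BatchInserter inserter = new BatchInserterImpl("target/graph.db");
        LuceneIndexBatchInserter index =
                new LuceneIndexBatchInserterImpl(inserter);
        try {
            long previous = -1;
            for (int i = 0; i < 5000000; i++) {
                Map<String, Object> props = new HashMap<String, Object>();
                props.put("name", "node-" + i);
                long node = inserter.createNode(props);
                // Index each node by name so it can be looked up later.
                index.index(node, "name", props.get("name"));
                if (previous != -1) {
                    inserter.createRelationship(previous, node,
                            DynamicRelationshipType.withName("NEXT"), null);
                }
                previous = node;
            }
            index.optimize();
        } finally {
            // This is the part that appears to hang for hours.
            index.shutdown();
            inserter.shutdown();
        }
    }
}
```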

At first, the JVM gave me a garbage collection exception, so I've set the heap
to 2gb.
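I'm launching the job roughly like this (the jar name is illustrative; -Xmx is
the max-heap flag I changed):

```shell
# 2 GB max heap for the batch inserter, which buffers a lot in memory
java -Xmx2048m -jar bulk-load.jar
```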

'top' tells me that the application is still running:

  PID USER      PR  NI  VIRT  RES  SHR S %CPU %MEM    TIME+  COMMAND
 9994 tim        17   0 2620m 2.3g 238m S 99.5 39.1 115:48.84 java

but checking the filesystem by running 'ls -l' a few times doesn't indicate
that any files are being updated.

Is this normal? Is there a way to improve performance?

I'm loading all my data in one go to make creating the db easier - it's simpler
to build it from scratch each time than to update an existing database - so
ideally I don't want to break this job down into multiple smaller jobs.
(Splitting it up would actually be OK if performance were good, but I ran into
problems when inserting data and retrieving existing nodes in the same run.)
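By "retrieving existing nodes" I mean get-or-create lookups against the batch
index during the load, along these lines (a hypothetical helper; as I understand
the old batch index API, getSingleNode returns -1 on no hit, and recent inserts
only become visible to lookups after an optimize/flush):

```java
import java.util.HashMap;
import java.util.Map;

import org.neo4j.index.lucene.LuceneIndexBatchInserter;
import org.neo4j.kernel.impl.batchinsert.BatchInserter;

public class LookupSketch {
    // Look up a node by its indexed "name"; create and index it if absent.
    static long getOrCreate(BatchInserter inserter,
                            LuceneIndexBatchInserter index,
                            String name) {
        long id = index.getSingleNode("name", name);
        if (id == -1) { // -1 = no index hit
            Map<String, Object> props = new HashMap<String, Object>();
            props.put("name", name);
            id = inserter.createNode(props);
            index.index(id, "name", name);
        }
        return id;
    }
}
```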

Thanks,
Tim

_______________________________________________
Neo4j mailing list
User@lists.neo4j.org
https://lists.neo4j.org/mailman/listinfo/user
