Hello all, I have a graph of about 200M relationships, and I often need to delete a large number of them. With the code below I am seeing huge memory usage and memory thrashing when deleting about 15M relationships.
When it hits tx.close() I see all CPU cores running at close to 100% utilization and thrashing for more than 30 minutes. I need this to finish in under 5 minutes, ideally. (Note: when I make large numbers of property changes, or create large numbers of new properties, I don't see these issues.) Any advice? Why is this happening?

Regards,
John

    int txc = 0;
    // serially delete the links
    try (Transaction tx = db.beginTx()) {
        for (int i = 0; i < deletedLinks.size(); i++) {
            Relationship rel = db.getRelationshipById(deletedLinks.get(i));
            rel.delete();
            txc++;
            if (txc > 50000) {
                txc = 0;
                tx.success();
            }
        }
        tx.success();
        tx.close();
    } catch (Exception e) {
        System.out.println("Exception link deletion: " + e.getMessage());
    }
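If I understand the embedded API correctly, tx.success() only *marks* the transaction for commit; the actual commit happens at tx.close(), so my per-50000 "batching" above may not actually be committing anything until the very end. Below is a standalone sketch of the per-batch commit pattern I'm aiming for. The deleteOne and commitBatch callbacks are stand-ins for rel.delete() and for closing/reopening the Neo4j transaction; they are placeholders, not the real Neo4j API, so this compiles and runs without Neo4j on the classpath.

    import java.util.ArrayList;
    import java.util.List;
    import java.util.function.LongConsumer;

    public class BatchDeleter {
        /**
         * Walks the id list, invoking deleteOne per id and commitBatch after
         * every batchSize deletions (plus once more for a final partial batch).
         * Returns the number of commits performed.
         */
        static int deleteInBatches(List<Long> ids, int batchSize,
                                   LongConsumer deleteOne, Runnable commitBatch) {
            int inBatch = 0;
            int commits = 0;
            for (long id : ids) {
                deleteOne.accept(id);        // stand-in for rel.delete()
                if (++inBatch >= batchSize) {
                    commitBatch.run();       // stand-in for tx.success(); tx.close();
                                             // then beginTx() for a fresh transaction
                    inBatch = 0;
                    commits++;
                }
            }
            if (inBatch > 0) {               // commit the final partial batch
                commitBatch.run();
                commits++;
            }
            return commits;
        }

        public static void main(String[] args) {
            List<Long> ids = new ArrayList<>();
            for (long i = 0; i < 120_000; i++) {
                ids.add(i);
            }
            // 120000 ids at 50000 per batch: two full batches plus one partial
            int commits = deleteInBatches(ids, 50_000, id -> { }, () -> { });
            System.out.println(commits);
        }
    }

The point of the pattern is that each commitBatch boundary releases the transaction state accumulated so far, instead of holding all 15M pending deletions in one transaction until close.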