Thanks for the answer. I can't remember exactly which inserter I'm using at the moment; I will check it tomorrow. But I do commit and start a new transaction after a fixed number of insertions.
At the same time, I execute some queries on the database while inserting the data, since the insertions are conditional and always require some checks before they are executed.

cheers,
Qiuyan

Quoting Martin Neumann <m.neumann.1...@gmail.com>:

> Do you use the BatchInserter or a normal Transaction?
> When using a normal Transaction to insert huge amounts of data, I always
> commit and create a new transaction every X items. This keeps each
> transaction small and reduces the memory used.
>
> cheers Martin
>
> On Sun, Jul 4, 2010 at 4:13 PM, <qiuyan...@mailbox.tu-berlin.de> wrote:
>
>> Hello,
>>
>> I'm currently working with the neo4j database and want to insert a
>> bunch of data into it.
>>
>> At the very beginning the program works quite well. But as more data
>> is inserted into the database, the insertion runs more and more
>> slowly, and I noticed that the program consumes a great deal of memory.
>> The problem occurs even though I split the input file into small
>> pieces so that each run only tries to insert a small part of the data.
>> That is, once a lot of data already exists in the database, the program
>> consumes so much memory from the moment it starts that the insertion
>> becomes too slow to ever finish.
>>
>> I wonder if there is a way to reduce the memory usage. Thanks in advance.
>>
>> Cheers,
>> Qiuyan

_______________________________________________
Neo4j mailing list
User@lists.neo4j.org
https://lists.neo4j.org/mailman/listinfo/user
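For the archive, the pattern Martin describes (commit and reopen the transaction every X items so no single transaction grows unbounded) can be sketched roughly as below. This is a minimal illustration only: the `Transaction` class and `beginTx()` here are hypothetical stand-ins, not the real `org.neo4j.graphdb` API, so the batching logic itself is the point.

```java
import java.util.ArrayList;
import java.util.List;

public class BatchedInsert {
    static final int BATCH_SIZE = 10_000; // commit every X items

    // Hypothetical stand-in for a database transaction.
    static class Transaction {
        void success() { /* mark the transaction for commit */ }
        void finish()  { /* commit and release the memory the tx holds */ }
    }

    static Transaction beginTx() { return new Transaction(); }

    // Inserts all items, committing and reopening the transaction every
    // BATCH_SIZE items. Returns the number of commits performed.
    public static int insertAll(List<String> items) {
        int commits = 0;
        Transaction tx = beginTx();
        try {
            int inBatch = 0;
            for (String item : items) {
                // ... conditional checks and the actual insertion go here ...
                inBatch++;
                if (inBatch == BATCH_SIZE) {
                    tx.success();
                    tx.finish();     // commit, freeing the batch's memory
                    tx = beginTx();  // start a fresh, small transaction
                    inBatch = 0;
                    commits++;
                }
            }
            tx.success();            // mark the final partial batch
        } finally {
            tx.finish();             // commit (or close) the last transaction
            commits++;
        }
        return commits;
    }

    public static void main(String[] args) {
        List<String> items = new ArrayList<>();
        for (int i = 0; i < 25_000; i++) items.add("item" + i);
        System.out.println(insertAll(items)); // 2 full batches + 1 final commit
    }
}
```

The key point is that `finish()` (the commit) runs periodically inside the loop rather than once at the end, so memory held by the transaction is bounded by the batch size instead of by the total input.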