I believe the standard traverser keeps a cache of visited node ids to
avoid covering the same ground again, and that will grow memory usage on
very deep traversals, but I would be surprised if it grew as much as you
are seeing. I think the Neo4j developers will know better.
I would probably still g
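The visited-set behaviour described above can be illustrated with a generic graph walk. This is a minimal self-contained sketch, not Neo4j's actual traverser implementation; the adjacency-map representation and all names here are illustrative assumptions.

```java
import java.util.*;

public class VisitedSetDemo {
    // Simplified breadth-first walk over an adjacency map. Like the traverser
    // cache described above, it records every visited node id so it never
    // covers the same ground twice -- which means the set grows with the
    // number of reachable nodes, not just with traversal depth.
    static int countReachable(Map<Long, List<Long>> adj, long start) {
        Set<Long> visited = new HashSet<>();   // this is the memory that grows
        Deque<Long> queue = new ArrayDeque<>();
        queue.add(start);
        visited.add(start);
        while (!queue.isEmpty()) {
            long node = queue.poll();
            for (long next : adj.getOrDefault(node, List.of())) {
                if (visited.add(next)) {       // skip already-seen ids
                    queue.add(next);
                }
            }
        }
        return visited.size();
    }

    public static void main(String[] args) {
        Map<Long, List<Long>> adj = Map.of(
            1L, List.of(2L, 3L),
            2L, List.of(3L),
            3L, List.of(1L));
        System.out.println(countReachable(adj, 1L)); // prints 3
    }
}
```

On a large, densely connected graph this set can hold millions of ids, which would account for some (though, as noted, perhaps not all) of the memory growth.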
Maybe that's the reason. Do queries consume a lot of memory when the
graph is large? When the graph's folder is about 300M, 1G of memory
seems not to be enough for the program.
Cheers,
Qiuyan
Quoting Craig Taverner :
You mentioned that you also do queries or searches as part of the process.
If the graph is growing in complexity, perhaps the queries are getting
slower?
On Mon, Jul 5, 2010 at 10:16 AM, Qiuyan Xu wrote:
I've just checked my code. It turns out that tx.finish() is already
there and I use a normal inserter.
In each run I insert 10 data items and restart the
transaction every 500 insertions. The problem is that when I execute the
program for about the 15th time, it runs slowly shortly af
Just throwing this out there (since people have made mistakes with this
before):
Remember that commit is spelled:
tx.success(); tx.finish();
If you forget tx.finish(); the transaction will still be alive, and continue
to grow and eat more memory.
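The idiom above can be shown with a minimal stand-in class. This is not the real org.neo4j.graphdb.Transaction, just a self-contained sketch of its success()/finish() contract, illustrating why a forgotten finish() leaves the transaction alive.

```java
public class TxIdiomDemo {
    // Toy transaction mirroring the success()/finish() shape discussed above.
    // Illustrative only -- the real class lives in org.neo4j.graphdb.
    static class Transaction {
        private boolean successful = false;
        private boolean open = true;
        void success() { successful = true; }  // mark as committable
        void finish()  { open = false; }       // commit or roll back, free resources
        boolean committed() { return !open && successful; }
        boolean stillOpen() { return open; }
    }

    public static void main(String[] args) {
        Transaction tx = new Transaction();
        try {
            // ... do the write work here ...
            tx.success();
        } finally {
            tx.finish(); // without this line the transaction stays open
        }
        System.out.println(tx.committed()); // prints true
    }
}
```

Putting finish() in a finally block, as sketched here, guarantees the transaction is closed even when the write work throws.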
Cheers,
Tobias
On Sun, Jul 4, 2010 at 4:33 PM, Q
Thanks for the answer. I can't remember exactly which inserter I use
currently. I will check it tomorrow. But I do commit and restart a new
transaction after a fixed number of insertions.
At the same time, I execute some queries on the database while I
insert the data, since such insertions
Do you use the BatchInserter or a normal Transaction?
When using a normal Transaction to insert huge amounts of data, I always
commit and create a new transaction every X items. This keeps each
transaction small and reduces the memory used.
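The periodic-commit pattern described above can be sketched as a self-contained simulation. The insertAll method and its counters are illustrative assumptions; in real Neo4j code the commit step would be tx.success(); tx.finish(); followed by graphDb.beginTx().

```java
public class BatchCommitDemo {
    // Returns the number of commits performed for the given workload.
    // Each "commit" here stands in for closing the current transaction
    // and opening a fresh one, keeping the live transaction small.
    static int insertAll(int totalInserts, int batchSize) {
        int commits = 0;
        int inBatch = 0;
        for (int i = 0; i < totalInserts; i++) {
            // ... insert one node/relationship inside the current transaction ...
            if (++inBatch == batchSize) {
                commits++;     // commit and restart the transaction
                inBatch = 0;
            }
        }
        if (inBatch > 0) {
            commits++;         // commit the final partial batch
        }
        return commits;
    }

    public static void main(String[] args) {
        System.out.println(insertAll(10_000, 500)); // prints 20
    }
}
```

With a batch size of 500 (the figure mentioned earlier in the thread), 10,000 insertions cost 20 commits, and the per-transaction memory stays bounded by the batch size rather than growing with the total workload.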
cheers Martin
On Sun, Jul 4, 2010 at 4:13 PM, wrote:
Hello,
I'm currently working with neo4j database and want to insert a bunch
of data into it.
At the very beginning the program works quite well, but as more data
is inserted into the database, the insertions run more and more
slowly, and I noticed that the program consumes really a lot