Ah, so inserting isn't very fast then. From looking at the code, this is rough for fast inserting, since you need to go in and out of the indexes all the time. Not sure how to speed that up. Could you keep some of the index in memory, maybe?
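Roughly what I mean is something like this. This is just a made-up sketch, not the real Neo4j or Lucene API: `slowIndexLookup` stands in for the disk-bound `index.getSingleNode(...)` round trip, and the key/value types are simplified to longs.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of "keep some of the index in memory": front the slow, on-disk
// index lookup with an in-memory map, so repeated lookups during a batch
// insert are served from RAM instead of hitting Lucene every time.
public class CachedLookup {
    private final Map<Long, Long> cache = new HashMap<>();

    // Placeholder for the expensive index.getSingleNode(...) round trip.
    // In the real scenario this would query the on-disk Lucene index.
    private long slowIndexLookup(long key) {
        return key * 2;
    }

    public long getNodeId(long key) {
        // computeIfAbsent only takes the slow path on a cache miss;
        // every repeat lookup for the same key is a plain HashMap get.
        return cache.computeIfAbsent(key, this::slowIndexLookup);
    }

    public static void main(String[] args) {
        CachedLookup lookup = new CachedLookup();
        System.out.println(lookup.getNodeId(21)); // first call: slow path
        System.out.println(lookup.getNodeId(21)); // repeat call: from RAM
    }
}
```

With a billion nodes you obviously can't cache everything in 8GB, but even a bounded cache of the hottest keys should cut a lot of those random reads.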
Cheers,

/peter neubauer

COO and Sales, Neo Technology

GTalk: neubauer.peter
Skype: peter.neubauer
Phone: +46 704 106975
LinkedIn: http://www.linkedin.com/in/neubauer
Twitter: http://twitter.com/peterneubauer

http://www.neo4j.org - Your high performance graph database.
http://www.tinkerpop.com - Processing for Internet-scale graphs.
http://www.thoughtmade.com - Scandinavia's coolest Bring-a-Thing party.


On Tue, Mar 23, 2010 at 10:49 AM, Laurent Laborde <kerdez...@gmail.com> wrote:
> Nope.
> "The LuceneIndexBatchInserter is designed for being performant when
> inserting large amounts of data with minimal lookups from the index
> during that time."
>
> Problem: I do a lot of lookups while inserting :)
>
> --
> Ker2x
>
>
> On Tue, Mar 23, 2010 at 10:43 AM, Peter Neubauer
> <neubauer.pe...@gmail.com> wrote:
>> Laurent,
>> have you looked at the BatchInserter for doing big initial
>> populations? It is MUCH faster than the normal Neo4j transactional
>> approach, see http://wiki.neo4j.org/content/Indexing_with_BatchInserter
>>
>> Would that help?
>>
>> Cheers,
>>
>> /peter neubauer
>>
>>
>> On Mon, Mar 22, 2010 at 11:35 PM, Laurent Laborde <kerdez...@gmail.com> wrote:
>>> On Fri, Mar 19, 2010 at 10:20 AM, Laurent Laborde <kerdez...@gmail.com> wrote:
>>>> Thank you, Peter.
>>>> Following the various links, I found http://www.cytoscape.org/
>>>> Looks very promising!! I'll try it this weekend :)
>>>
>>> 3 days later, I'm still populating Neo4j like crazy, trying to
>>> compute the Collatz conjecture up to 100 million.
>>> I'm at 63.5 million, but it's now much, much slower than at the beginning.
>>>
>>> Probably because of the insane number of index.getSingleNode calls
>>> against Lucene on a growing index.
>>> I'm expecting something like a billion nodes when it reaches 100
>>> million.
>>>
>>> The index is 6GB and the database is 9GB.
>>> The CPU usage is now ~10% instead of 25%. I have a grand total of 9+6
>>> GB of data in the Neo4j directory with only 8GB of RAM on my Windows 7
>>> box, so I'm now (random) IO bound.
>>>
>>> My SATA VelociRaptor 10krpm is fighting at 1MB/s (not so bad,
>>> considering it's highly mixed R/W and purely random IO).
>>>
>>> Time to buy a 32GB SSD :)
>>>
>>> --
>>> Keru
>>>
>>> --
>>> Laurent "ker2x" Laborde
>>> Sysadmin & DBA at http://www.over-blog.com/
>>> _______________________________________________
>>> Neo mailing list
>>> User@lists.neo4j.org
>>> https://lists.neo4j.org/mailman/listinfo/user