[Neo4j] Write Neo4j Books - Packt Publishing

2010-07-09 Thread Kshipra Singh
Hi All, I represent Packt Publishing, the publishers of computer-related books. We are planning to extend our range of Open Source books based on Java technology and are currently inviting authors interested in writing them. This doesn't require any past writing experience. All that we

Re: [Neo4j] Write Neo4j Books - Packt Publishing

2010-07-09 Thread Laurent Laborde
On Fri, Jul 9, 2010 at 9:47 AM, Kshipra Singh kship...@packtpub.com wrote: Hi All, I represent Packt Publishing, the publishers of computer-related books. We are planning to extend our range of Open Source books based on Java technology and are currently inviting authors interested in

Re: [Neo4j] How to traverse by the number of relationships between nodes?

2010-07-09 Thread Tim Jones
Hi Craig, That's great, thanks a lot. I'll give it a go. Cheers, Tim - Original Message From: Craig Taverner cr...@amanzi.com To: Neo4j user discussions user@lists.neo4j.org Sent: Thu, July 8, 2010 8:49:38 PM Subject: Re: [Neo4j] How to traverse by the number of relationships

[Neo4j] Can I use neo4j for this?

2010-07-09 Thread Gurkensalat
Dear all, I'm completely new to neo4j (and don't even really speak Java), but I have been struggling in vain for quite a while to get sensible performance on my graph-data in MySQL and PostgreSQL. From your webpage and other posts on the lists I got the great feeling that newbies are welcome

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Mattias Persson
Modifications in a transaction are kept in memory so that the transaction can be rolled back completely if something goes wrong. There could of course be a solution (I'm just speculating here) where, if a tx gets big enough, such a transaction gets converted into its own
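
For context, the transaction idiom being discussed looks roughly like this in the embedded 1.x API. This is only a sketch; the store path, loop size and property values are placeholders:

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.Transaction;
    import org.neo4j.kernel.EmbeddedGraphDatabase;

    public class TxMemoryExample
    {
        public static void main( String[] args )
        {
            GraphDatabaseService graphDb = new EmbeddedGraphDatabase( "var/graphdb" );
            Transaction tx = graphDb.beginTx();
            try
            {
                // Every write below is buffered in memory until the transaction
                // finishes, which is what makes a complete rollback possible and
                // also why one huge transaction can exhaust the heap.
                for ( int i = 0; i < 1000; i++ )
                {
                    Node node = graphDb.createNode();
                    node.setProperty( "name", "node-" + i );
                }
                tx.success();
            }
            finally
            {
                tx.finish(); // commit (or roll back) and release the buffered state
            }
            graphDb.shutdown();
        }
    }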

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Marko Rodriguez
Hi, Would it actually be worth something to be able to begin a transaction which auto-commits stuff every X write operations, like a batch inserter mode which can be used in a normal EmbeddedGraphDatabase? Kind of like: graphDb.beginTx( Mode.BATCH_INSERT ) ...so that you can start such
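
The Mode.BATCH_INSERT call above is only a proposal; until something like it exists, the usual workaround is to commit in chunks by hand. A minimal sketch against the embedded 1.x API; the BATCH_SIZE value, store path and property name are placeholders:

    import org.neo4j.graphdb.GraphDatabaseService;
    import org.neo4j.graphdb.Transaction;
    import org.neo4j.kernel.EmbeddedGraphDatabase;

    public class PeriodicCommitExample
    {
        private static final int BATCH_SIZE = 10000; // commit every 10k writes

        public static void main( String[] args )
        {
            GraphDatabaseService graphDb = new EmbeddedGraphDatabase( "var/graphdb" );
            Transaction tx = graphDb.beginTx();
            try
            {
                for ( int i = 0; i < 1000000; i++ )
                {
                    graphDb.createNode().setProperty( "id", i );
                    if ( ( i + 1 ) % BATCH_SIZE == 0 )
                    {
                        // Commit the current chunk and start a fresh transaction
                        // so the in-memory transaction state stays bounded.
                        tx.success();
                        tx.finish();
                        tx = graphDb.beginTx();
                    }
                }
                tx.success();
            }
            finally
            {
                tx.finish();
            }
            graphDb.shutdown();
        }
    }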

Re: [Neo4j] Can I use neo4j for this?

2010-07-09 Thread Laurent Laborde
On Fri, Jul 9, 2010 at 12:33 PM, gurkensa...@gmx.de wrote: Dear all, Dear you, I'm completely new to neo4j (and don't even really speak Java), I'm a sysadmin with poor object programming skills. It wasn't a problem to use neo4j, as the api is simple and clear. (No enterprisy

Re: [Neo4j] Is it possible to count common nodes when traversing?

2010-07-09 Thread Mattias Persson
Just to notify you guys on this... as of now (r4717) the TraversalFactory class is named Traversal instead, so code would look like:

    for ( Node currentNode : TraversalFactory.description()
            .breadthFirst()
            .uniqueness(Uniqueness.RELATIONSHIP_GLOBAL)

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Arijit Mukherjee
I've a similar problem. Although I'm not going out of memory yet, I can see the heap constantly growing, and JProfiler says most of it is due to the Lucene indexing. And even if I do the commit after every X transactions, once the population is finished, the final commit is done, and the graph db

Re: [Neo4j] Is it possible to count common nodes when traversing?

2010-07-09 Thread Mattias Persson
Sorry, it should be:

    for ( Node currentNode : Traversal.description()
            .breadthFirst()
            .uniqueness( Uniqueness.RELATIONSHIP_GLOBAL)
            .relationships(MyRelationships.SIMILAR)
            .relationships(MyRelationships.CATEGORY)
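
The preview cuts off before the terminating traverse() call. A complete sketch of the renamed 1.x traversal API, assuming MyRelationships is a user-defined RelationshipType enum and startNode is some node of interest:

    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.RelationshipType;
    import org.neo4j.graphdb.traversal.TraversalDescription;
    import org.neo4j.kernel.Traversal;
    import org.neo4j.kernel.Uniqueness;

    public class SimilarityTraversalExample
    {
        // Hypothetical relationship types, mirroring the ones in the thread.
        enum MyRelationships implements RelationshipType { SIMILAR, CATEGORY }

        public static void traverseFrom( Node startNode )
        {
            TraversalDescription description = Traversal.description()
                    .breadthFirst()
                    .uniqueness( Uniqueness.RELATIONSHIP_GLOBAL )
                    .relationships( MyRelationships.SIMILAR )
                    .relationships( MyRelationships.CATEGORY );

            // traverse() returns a lazily evaluated Traverser; nodes() iterates
            // the nodes visited under the uniqueness rule chosen above.
            for ( Node currentNode : description.traverse( startNode ).nodes() )
            {
                System.out.println( currentNode );
            }
        }
    }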

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Rick Bullotta
Short answer is maybe. ;-) There are some cases where the transaction is an all or nothing scenario, others where incremental commits are OK. Having the ability to do incremental autocommits would be useful, however. In a perfect world, it could be based on a bucket (e.g. XXX transactions), a

Re: [Neo4j] How to traverse by the number of relationships between nodes?

2010-07-09 Thread Johan Svensson
Hi, I would not recommend using large numbers of different (dynamically created) relationship types. It is better to use well-defined relationship types with an additional property on the relationship whenever needed. The limit is actually not 64k but 2^31, but having large numbers of
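
A sketch of the two approaches, assuming a "color"-like attribute; the type and property names are placeholders, and both methods assume an open transaction:

    import org.neo4j.graphdb.DynamicRelationshipType;
    import org.neo4j.graphdb.Node;
    import org.neo4j.graphdb.Relationship;
    import org.neo4j.graphdb.RelationshipType;

    public class RelationshipTypeExample
    {
        enum Types implements RelationshipType { CONNECTED }

        // Discouraged: one dynamically created type per value quickly leads to a
        // very large number of distinct relationship types.
        static void linkWithDynamicType( Node a, Node b, String color )
        {
            a.createRelationshipTo( b, DynamicRelationshipType.withName( "COLOR_" + color ) );
        }

        // Recommended: one well-defined type, with the variable part stored as a
        // property on the relationship itself.
        static void linkWithProperty( Node a, Node b, String color )
        {
            Relationship rel = a.createRelationshipTo( b, Types.CONNECTED );
            rel.setProperty( "color", color );
        }
    }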

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Paul A. Jackson
I confess I had not investigated the batch inserter. From the description it fits my requirements exactly. With respect to auto-commits, it seems there are two use cases. The first is everyday operations that might run out of memory. In this case it might be nice for neo4j to swap out
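
For reference, the batch inserter under discussion is used roughly like this. This is a sketch assuming the 1.x batch inserter API (BatchInserterImpl in org.neo4j.kernel.impl.batchinsert); exact class and package names varied between releases, and the store path, property values and KNOWS type are placeholders:

    import java.util.HashMap;
    import java.util.Map;

    import org.neo4j.graphdb.RelationshipType;
    import org.neo4j.kernel.impl.batchinsert.BatchInserter;
    import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

    public class BatchInsertExample
    {
        enum Types implements RelationshipType { KNOWS }

        public static void main( String[] args )
        {
            // The batch inserter bypasses transactions entirely and writes
            // straight to the store files, so nothing piles up on the heap
            // waiting for a commit.
            BatchInserter inserter = new BatchInserterImpl( "var/graphdb" );
            try
            {
                Map<String, Object> aliceProps = new HashMap<String, Object>();
                aliceProps.put( "name", "Alice" );
                long alice = inserter.createNode( aliceProps );

                Map<String, Object> bobProps = new HashMap<String, Object>();
                bobProps.put( "name", "Bob" );
                long bob = inserter.createNode( bobProps );

                inserter.createRelationship( alice, bob, Types.KNOWS,
                        new HashMap<String, Object>() );
            }
            finally
            {
                inserter.shutdown(); // flushes everything to disk
            }
        }
    }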

Re: [Neo4j] How to traverse by the number of relationships between nodes?

2010-07-09 Thread Max De Marzi Jr.
Can you expand on this a bit... as to what the graph internals are doing: Option 1: You have colored relationships (RED, BLUE, GREEN, etc., up to 10k colors). From a random node, you traverse the graph finding all nodes that it is connected to via the PURPLE or FUSIA relationship. vs. Option 2: You

Re: [Neo4j] OutOfMemory while populating large graph

2010-07-09 Thread Bill Janssen
Note that a couple of memory issues are fixed in Lucene 2.9.3: leaking when indexing big docs, and slow reclamation of space from the FieldCache. Bill Arijit Mukherjee ariji...@gmail.com wrote: I've a similar problem. Although I'm not going out of memory yet, I can see the heap