Hi Stephan,

You could try lowering the heap size to -Xmx2G and setting
cache_type=weak, with 10G memory-mapped for the relationship store. The
machine only has 16G RAM and will not be able to process such a large
dataset at in-memory speeds.
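As a rough sketch, the memory-mapped settings would go into the store
configuration (conf/neo4j.properties for the server, or the config map
you pass when opening the store). The property names below are the 1.x
ones and the numbers are untested assumptions to tune for your own
store files:

    # untested starting point for a 16G machine
    cache_type=weak
    neostore.nodestore.db.mapped_memory=100M
    neostore.relationshipstore.db.mapped_memory=10G
    neostore.propertystore.db.mapped_memory=2G

The -Xmx2G flag itself goes on the JVM that runs your application, e.g.
java -Xmx2G -cp ... YourApp.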

Another option is to calculate the degree at insertion time and store
it as a property on each node.
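A minimal sketch of that idea, assuming the 1.x BatchInserter API
(package and class names vary between releases, and the in-memory
degree map is a hypothetical helper of my own, not something Neo4j
provides):

    import java.util.HashMap;
    import java.util.Map;
    import org.neo4j.graphdb.DynamicRelationshipType;
    import org.neo4j.graphdb.RelationshipType;
    import org.neo4j.kernel.impl.batchinsert.BatchInserter;
    import org.neo4j.kernel.impl.batchinsert.BatchInserterImpl;

    public class DegreeAtInsertTime {
        public static void main(String[] args) {
            BatchInserter inserter = new BatchInserterImpl("target/graph.db");
            RelationshipType KNOWS = DynamicRelationshipType.withName("KNOWS");

            // Count degrees in memory while inserting.
            Map<Long, Integer> degree = new HashMap<Long, Integer>();
            long a = inserter.createNode(null);
            long b = inserter.createNode(null);
            degree.put(a, 0);
            degree.put(b, 0);

            inserter.createRelationship(a, b, KNOWS, null);
            degree.put(a, degree.get(a) + 1);
            degree.put(b, degree.get(b) + 1);

            // One final pass writes the counter onto each node. Note that
            // setNodeProperties replaces all properties on the node, so
            // merge with getNodeProperties if you store anything else.
            for (Map.Entry<Long, Integer> e : degree.entrySet()) {
                Map<String, Object> props = new HashMap<String, Object>();
                props.put("degree", e.getValue());
                inserter.setNodeProperties(e.getKey(), props);
            }
            inserter.shutdown();
        }
    }

With many millions of nodes an int[] indexed by node id will be much
cheaper than the HashMap, since the batch inserter assigns node ids
sequentially from zero.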

Regards,
Johan

On Wed, Sep 21, 2011 at 12:44 PM, st3ven <st3...@web.de> wrote:
> Hi Linan,
>
> I just tried it with the outgoing relationships, but unfortunately that
> didn't speed things up.
>
> The size of my db is around 140GB, so it is not possible for me to dump
> the full directory into a ramfs.
> My files on the hard disk have the following sizes:
> neostore.nodestore.db = 31MB
> neostore.relationshipstore.db = 85GB
> neostore.propertystore.db = 65GB
> neostore.propertystore.db.strings = 180MB
> Is there maybe a chance of reducing the size of my database?
>
> Cheers,
> Stephan