Hi,

I am interested in adding network metrics such as centrality, eigenvector
centrality, degree, etc., to graphs that I must assume will contain many
(millions of) nodes. I would welcome any suggestions on the best way to
approach this:

- Is it reasonable to add these metrics as properties of the nodes? My
thought here is that this would work nicely when exporting the graph as GraphML.
- Can these metrics be maintained in the database over time, or should
they be calculated as needed?
- Does the calculation of a metric for a single node require
traversing the entire graph (or at least the sub-graph it is connected to)?
Does it depend on the metric being calculated?
- If the answer is yes, and it takes a long time to update a set of
metrics, what are the typical solutions? Do we go down a path like data
warehousing, where the graph is loaded from the operational store
periodically in batches and then grows stale over time? What might be some
solutions for graphs that are constantly updated, or is the tradeoff simply
that for the metrics to be valid the entire graph must be recomputed after
any update? (For example, can a node be timestamped, or is it the case that
any change to the graph can change the metrics for every other node?)
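To make the third and fourth questions concrete, here is a minimal sketch (a toy in-memory adjacency list, not Neo4j itself; the graph data is invented) illustrating why the answer depends on the metric: degree only inspects one node's own edges, while eigenvector centrality is typically computed by power iteration, which touches every node and edge on each pass, so a change anywhere can shift every node's score.

```python
# Toy undirected graph as an adjacency list (hypothetical data).
graph = {
    "a": ["b", "c"],
    "b": ["a", "c"],
    "c": ["a", "b", "d"],
    "d": ["c"],
}

def degree(node):
    # Local metric: only reads this node's adjacency list, O(deg(node)).
    # Storing it as a node property is cheap to keep current.
    return len(graph[node])

def eigenvector_centrality(iters=100):
    # Global metric via power iteration: every pass visits every node
    # and edge, so any edit to the graph can change every score.
    score = {n: 1.0 for n in graph}
    for _ in range(iters):
        nxt = {n: sum(score[m] for m in graph[n]) for n in graph}
        norm = max(nxt.values()) or 1.0          # normalize by the max
        score = {n: v / norm for n, v in nxt.items()}
    return score

print(degree("c"))                               # local: 3
print(max(eigenvector_centrality(), key=lambda n: eigenvector_centrality()[n]))
```

So a "compute on write" strategy may be reasonable for local metrics like degree, while global metrics like eigenvector or betweenness centrality tend to push you toward the batch/periodic recomputation you describe.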

Thanks in advance.

-Paul




_______________________________________________
Neo4j mailing list
User@lists.neo4j.org
https://lists.neo4j.org/mailman/listinfo/user
