Hi,
I am using Neo4j 1.9.7, SDN 2.2.0.RELEASE and HAProxy 1.4.24 to run the
environment. I have 6 servers: 1 master and 5 slaves. Writes are sent to all
of the servers.
Now I always get this exception:
org.springframework.dao.DataRetrievalFailureException: 3189664; nested
exception is org.neo4j.graphdb.NotFoundException
I also encountered this problem. My workaround is:
1. Import the data into a single Neo4j instance (one instance only).
2. Update some data (any node will do; see the sketch after the note).
3. Start the HA cluster.
Note:
Start up the Neo4j instance that imported the data first.
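For step 2, a minimal Cypher sketch of "touch any node" (the property name warmed_up is made up, and node 0 is just the 1.9 reference node; any small write on any node works):
START n=node(0)
SET n.warmed_up = true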
On Tuesday, June 17, 2014 at 3:59:32 PM UTC+8, Mamta Thakur wrote:
>
> Hi,
>
> I have b
Hi PM,
I don't know what your code looks like, so I am just describing my idea.
In SDN, if you create a node yourself with a Cypher query, you have to add
the type label with a leading underscore (_). If you use save(), SDN adds the
label automatically and you don't write the underscore yourself. The SDN
3.0.1 default is that the label has no underscore.
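Roughly, the two cases in Cypher (the Person entity and the name property are made up; check what labels save() really writes in your version, e.g. with RETURN labels(n)):
CREATE (n:`_Person` {name: 'Tom'})   // type label added by hand, with the leading underscore
CREATE (m:Person {name: 'Tom'})      // SDN 3.0.1 default: plain label, no underscore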
On Thursday, June 12, 2014 at 7:41:35 PM UTC+8,
About the "You can also increase your heap or have the queries only work on
a batch of data."
If the data is very large,the increase will no limit. This is not a suit
way of doing.
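Working on a batch at a time would look roughly like this (the node id 12346 and the batch size 1000 are made up; the statement has to be re-run until nothing is left to delete, assuming this Cypher version accepts LIMIT after WITH):
START node=node(12346)
MATCH node-[r]-()
WITH r LIMIT 1000
DELETE r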
Now I am using Spring Data Neo4j. I can use the template to work on a batch
of data, but I use @Query to implement it. So I
I am sorry about the "rich" relationship; it means a relationship that has
properties.
Now I need to do it with one query. I remember that the Neo4j manual says of
DELETE that "This query isn't for deleting large amounts of data, but is nice
when playing around with small example data sets". I
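If I remember the 1.9 manual correctly, that note is attached to a small example along these lines (the node id is made up):
START n=node(3)
MATCH n-[r]-()
DELETE n, r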
Hi there,
I am using the Neo4j 1.9.7 enterprise version.
Now I need to delete part of the graph, like this:
START node=node(12346)
MATCH node - [r2:FOWARD] - user - [r3] - user2
DELETE r3
WITH node,user
DELETE user
WITH node
MATCH node - [r1] - user1
DELETE r1
WITH node
DELETE node
When the data is