Another thing I noticed:

Even though I used a construct like the following to upload a file with about 8,000 thesaurus relations:

USING PERIODIC COMMIT
LOAD CSV FROM "http://myserver.com/neo4j/eurovoc_broader.csv" AS csvLine
MATCH (t:Term { id: toInt(csvLine[0]) }), (bt:Term { id: toInt(csvLine[1]) })
CREATE (t)-[:BT]->(bt)

I got "Unknown errors" quite often. I really had to find out by trial and 
error what the maximum size was.  If i went over it, I got an  "Unknown 
error"  and had to start all over again. Would be very nice to have a 
setting that allows me to force a commit after a certain number of 
processed rows and just let the server do its work, regardless of the 
length of the csv file.  In my case, I had to cut the file into 17 separate 
csv's and feed them one by one to the server.
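
For what it's worth, here is a sketch of roughly what I have in mind, assuming the number of rows per commit could simply be given right after USING PERIODIC COMMIT (the 500 is just a placeholder batch size, and the URL is the same test file as above):

USING PERIODIC COMMIT 500
LOAD CSV FROM "http://myserver.com/neo4j/eurovoc_broader.csv" AS csvLine
// commit after every 500 processed rows, however long the file is
MATCH (t:Term { id: toInt(csvLine[0]) }), (bt:Term { id: toInt(csvLine[1]) })
CREATE (t)-[:BT]->(bt)

That way the whole file could be fed to the server in one go instead of being split up by hand.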
