It keeps its state in memory, which is then written en bloc to Lucene.

You'd flush in order to read from the index, e.g. to find nodes to connect.

Actually, it also auto-flushes internally after a few million entries or so.
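
For illustration, here's a rough sketch of that pattern with the legacy Lucene batch-insert index (class names are from the Neo4j 2.x unsafe.batchinsert API; the store path, index name, and property keys are made up, so treat it as a sketch rather than a drop-in recipe): flush once between the creation phase and the lookup phase, not after every add.

import org.neo4j.graphdb.DynamicRelationshipType;
import org.neo4j.helpers.collection.MapUtil;
import org.neo4j.index.lucene.unsafe.batchinsert.LuceneBatchInserterIndexProvider;
import org.neo4j.unsafe.batchinsert.BatchInserter;
import org.neo4j.unsafe.batchinsert.BatchInserterIndex;
import org.neo4j.unsafe.batchinsert.BatchInserterIndexProvider;
import org.neo4j.unsafe.batchinsert.BatchInserters;

public class BatchImportSketch {
    public static void main(String[] args) {
        BatchInserter inserter = BatchInserters.inserter("target/example-db");
        BatchInserterIndexProvider indexProvider =
                new LuceneBatchInserterIndexProvider(inserter);
        BatchInserterIndex people =
                indexProvider.nodeIndex("people", MapUtil.stringMap("type", "exact"));

        // Phase 1: create nodes. Index additions accumulate in memory and are
        // not yet visible to get()/query().
        long alice = inserter.createNode(MapUtil.map("name", "Alice"));
        people.add(alice, MapUtil.map("name", "Alice"));
        long bob = inserter.createNode(MapUtil.map("name", "Bob"));
        people.add(bob, MapUtil.map("name", "Bob"));

        // One flush before the lookup phase makes the additions above visible.
        // Flushing after every single add is what the docs warn against.
        people.flush();

        // Phase 2: read from the index to find the nodes to connect.
        long from = people.get("name", "Alice").getSingle();
        long to = people.get("name", "Bob").getSingle();
        inserter.createRelationship(from, to,
                DynamicRelationshipType.withName("KNOWS"), null);

        indexProvider.shutdown();
        inserter.shutdown();
    }
}

The only reason for the mid-import flush is the read in phase 2; if you never query the index during the import, flushing once at the very end (or letting the internal auto-flush handle it) is enough.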

Sent from mobile device

On 27.01.2014 at 20:31, Javad Karabi <karabija...@gmail.com> wrote:

> From http://docs.neo4j.org/chunked/milestone/indexing-batchinsert.html#indexing-batchinsert-best-practices :
> Try to avoid flushing too often, because each flush makes all additions 
> (since the last flush) visible to the querying methods, and publishing 
> those changes can be a performance penalty.
> 
> Using single-threaded batch insertion, why would one flush the index at all 
> (except at the end, of course)? 
> Why not just use the index and only flush at the very end of the import?
