Michael wrote:
I've got a process external to Solr that is constantly feeding it new
documents, retrying whenever Solr is unresponsive.  What's the right way
to stop Solr (running in Tomcat) so that no documents are lost?

Currently I'm committing all cores and then running Catalina's stop
script, but between my commit and the stop, more documents can come in
that would need *another* commit...

Lots of people must have had this problem already, so I know the
answer is simple; I just can't find it!

Thanks.
Michael
I don't know if this is the best solution, or even whether it applies to your situation, but we do incremental updates from a database based on a timestamp (taken from a simple separate SQL table filled by triggers, so deletes are captured correctly as well). We store this timestamp in Solr too.

Our index script first does a simple Solr request to fetch the newest timestamp, then selects the documents to update with "SELECT * FROM document_updates WHERE timestamp >= X", where X is the timestamp returned from Solr. We use >= for the hopefully extremely rare case where two updates share the same timestamp and the index script runs at that same moment, having picked up only one of them. This can cause some documents to be updated more than once, but since document updates are idempotent, that's no real problem.
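A minimal sketch of the approach described above, using Python with an in-memory SQLite table standing in for the trigger-maintained document_updates table. The table layout, column names, and the hard-coded "newest timestamp from Solr" are all assumptions for illustration; in the real script the timestamp would come from a Solr query (e.g. sorting on the timestamp field descending with rows=1).

```python
import sqlite3

def select_updates_since(conn, since_ts):
    """Fetch rows changed at or after since_ts.

    Uses >= (not >) so an update sharing the exact timestamp of the last
    indexed document is not skipped; because re-indexing a document is
    idempotent, the occasional duplicate pick-up is harmless.
    """
    cur = conn.execute(
        "SELECT id, timestamp FROM document_updates "
        "WHERE timestamp >= ? ORDER BY timestamp, id",
        (since_ts,),
    )
    return cur.fetchall()

# Hypothetical demo data standing in for the trigger-filled table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE document_updates (id INTEGER, timestamp INTEGER)")
conn.executemany("INSERT INTO document_updates VALUES (?, ?)",
                 [(1, 100), (2, 200), (3, 200), (4, 300)])

# In the real script this value would be fetched from Solr first.
newest = 200
rows = select_updates_since(conn, newest)
print(rows)  # both rows at ts 200 are re-selected, plus everything newer
```

Note how the boundary row (id 2, timestamp 200) is selected again even though it was presumably already indexed; that is the deliberate cost of >= described above.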

Regards,

gwk
