Dawg,
I have a similar setup, and this is what works for me. I have a field which
contains a timestamp. The timestamp is set to be identical for all documents
added/updated in a run. When the run is complete and some/many documents have
been overwritten, I can then delete all documents that were not updated.
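A minimal sketch of that timestamp approach, assuming a date field named index_time_dt (a hypothetical name; substitute whatever your schema uses) and Solr's JSON update syntax:

```python
import json
from datetime import datetime, timezone

def run_timestamp() -> str:
    # One timestamp, computed once per run, stamped onto every document.
    return datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")

def stamp(docs, ts):
    # Attach the shared run timestamp to each document before indexing.
    return [dict(d, index_time_dt=ts) for d in docs]

def cleanup_payload(ts):
    # Delete-by-query payload for Solr's JSON update endpoint: removes
    # every document whose stamp is strictly older than this run's
    # ("}" makes the upper bound exclusive).
    return json.dumps({"delete": {"query": f"index_time_dt:[* TO {ts}}}"}})
```

POSTing cleanup_payload(ts) to the update handler after the run commits is what deletes the un-updated documents.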
Hi,
There is a possibility that you ended up with documents with the same ID and
that you are overwriting documents instead of writing new ones.
In any case, I would suggest you change your approach, provided you have
enough disk space to keep two copies of the index:
1. use alias to read data from index
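A sketch of how that alias step can look in SolrCloud, assuming hypothetical collection names products_v1/products_v2 behind an alias named products: readers always query the alias, a rebuild writes into the inactive collection, and the Collections API CREATEALIAS action then repoints the alias in one atomic step.

```python
from urllib.parse import urlencode

def swap_alias_url(solr_base, alias, new_collection):
    # CREATEALIAS creates the alias if missing, or atomically repoints
    # it if it already exists, so readers never see a half-built index.
    params = urlencode({"action": "CREATEALIAS",
                        "name": alias,
                        "collections": new_collection})
    return f"{solr_base}/admin/collections?{params}"
```

After the swap succeeds, the old collection can be dropped at leisure.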
I have an index with about a million documents. It is the backend for a
shopping cart system.
Sometimes the inventory gets out of sync with Solr and the storefront
contains out of stock items.
So I set up a scheduled task on the server to run at 12 a.m. every morning to
delete the entire Solr index.
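For reference, a nightly wipe like this is just a delete-by-query for *:* plus a commit; shown here as the JSON update payload only, not as a recommendation (the replies in this thread suggest safer alternatives):

```python
import json

def wipe_payload():
    # Matches every document in the index; POSTing this to the update
    # handler (followed by a commit) empties the whole collection.
    return json.dumps({"delete": {"query": "*:*"}})
```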