We are considering Solr to store events that will be added to and deleted from
the index at a very fast rate. Solr would be used, in this case, to find the
right event to process (events may have several attributes, and we search for
the best match given the query attributes). Our understanding is that the
common use cases are those where the read rate is much higher than the write
rate and deletes are infrequent, so we are not sure whether Solr handles our
use case well or is the right fit. Given that, I have a few questions:

1 - How does Solr/Lucene performance degrade with index fragmentation? That
would probably determine how often we would need to optimize the index. I
presume it depends on the rates of insertion and deletion, but do you have any
benchmarks on this degradation? Or, in general, what has your experience been
with this kind of use case?
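For context, the kind of policy we have in mind is roughly "optimize once the deleted-document ratio crosses some threshold" (the threshold value below is just a placeholder, and the stat names mirror Solr's numDocs/maxDoc counters):

```python
# Rough sketch of an optimize-trigger policy. In Solr, maxDoc counts live
# plus deleted documents and numDocs counts live documents only, so
# (maxDoc - numDocs) / maxDoc approximates index fragmentation.

def should_optimize(num_docs: int, max_doc: int, threshold: float = 0.25) -> bool:
    """Return True once the deleted-document ratio crosses the threshold."""
    if max_doc == 0:
        return False
    deleted_ratio = (max_doc - num_docs) / max_doc
    return deleted_ratio >= threshold

# e.g. 1M live docs out of 1.5M total -> ratio ~0.33, so we would optimize
print(should_optimize(1_000_000, 1_500_000))
```

Whether 25% is a sensible threshold is exactly the kind of thing we would love benchmark data on.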

2 - Optimizing seems to be a very expensive process. How much does search
performance degrade while the index is being optimized? A large degradation
would mean we could not optimize unless we switched searches to another copy
of the index while the optimize runs.
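The "switch to another copy" idea above, sketched very roughly (the names are ours, not a Solr API, and keeping the standby copy up to date with recent writes is handwaved here):

```python
# Rough sketch of the dual-copy idea: serve searches from one copy of the
# index while the other copy is optimized, then swap roles so the freshly
# optimized copy starts serving.

class DualIndex:
    def __init__(self):
        self.copies = ["copy_a", "copy_b"]
        self.active = 0  # which copy currently serves searches

    @property
    def serving(self) -> str:
        return self.copies[self.active]

    @property
    def standby(self) -> str:
        return self.copies[1 - self.active]

    def optimize_and_swap(self, optimize) -> None:
        """Run the expensive optimize on the standby copy, then swap."""
        optimize(self.standby)       # merge runs off the search path
        self.active = 1 - self.active  # promote the optimized copy

d = DualIndex()
d.optimize_and_swap(lambda copy_name: None)  # placeholder optimize call
print(d.serving)
```

The open question for us is whether this complexity is necessary, or whether searches remain acceptably fast on a single copy during optimize.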

3 - In terms of high availability, what has been your experience with
detecting failure of the master and having a slave take over?
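What we are considering is along these lines: treat the master as failed after N consecutive missed health checks, then promote a slave (the health check would be something like Solr's /admin/ping handler; the function below is just a stand-in for that decision logic):

```python
# Rough sketch of the failover decision we have in mind. master_ok_history
# holds the results of periodic health checks (True = ping succeeded).

def pick_active(master_ok_history, master: str, slave: str,
                max_failures: int = 3) -> str:
    """Return which node should serve, given recent health-check results."""
    recent = master_ok_history[-max_failures:]
    # Fail over only after max_failures consecutive failed checks.
    master_down = len(recent) == max_failures and not any(recent)
    return slave if master_down else master

# three consecutive failed pings -> fail over to the slave
print(pick_active([True, False, False, False], "master", "slave"))
```

In particular, we would like to hear how reliable this kind of detection is in practice and how painful the slave promotion step turns out to be.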

Thanks,
Rodrigo
