Hi Andy,
We are currently indexing about 650,000 full-text books per Solr/Lucene
index. We have 10 shards for a total of about 6.5 million documents, and our
average response time is under 2 seconds, but the slowest 1% of queries take
between 5 and 30 seconds. If you were searching only on
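(For anyone following along: a distributed search across shards like the setup above is just an ordinary Solr query with a `shards` parameter listing the cores to fan out to. A minimal sketch of building such a request — the hostnames and core name are hypothetical, not our actual machines:)

```python
from urllib.parse import urlencode

# Hypothetical shard hosts; in a setup like ours each shard
# would hold roughly 650,000 books.
SHARDS = [f"solr{i}.example.org:8983/solr/books" for i in range(1, 11)]

def build_distributed_query(q, rows=10):
    """Build the query string for a sharded Solr search.

    Solr fans a request carrying a `shards` parameter out to every
    core named in it and merges the per-shard results before
    responding, so the client sees one ranked result list.
    """
    params = {
        "q": q,
        "rows": rows,
        "shards": ",".join(SHARDS),
    }
    return "/select?" + urlencode(params)

query = build_distributed_query("ocean chapter")
```

(Each shard indexes independently, so indexing throughput scales with the number of machines; only the query fan-out and merge are coordinated.)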
On Sat, 2010-08-14 at 03:24 +0200, andynuss wrote:
> Let's say that I am indexing large book documents broken into chapters. A
> typical book that you buy at amazon. What would be the approximate limit to
> the number of books that can be indexed slowly and searched quickly? The
> search unit would be a chapter, so assume that a book is divided into 15-50
> chapters, each chapter streamed into a docid unit. So a search hit is a
> chapter. Any ideas?
>
> How do I find out more about sharding and SOLR?
>
> Andy
> --
> View this message in context:
> http://lucene.472066.n3.nabble.com/scalability-limit-in-terms-of-numbers-of-large-documents-tp1142517p1146449.html
> Sent from the Lucene - Java Users mailing list archive at Nabble.com.
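(The chapter-per-document layout described in the quoted message — one indexable document per chapter, keyed so a hit identifies both the book and the chapter — could be sketched like this; the field names are illustrative, not from the thread:)

```python
def chapters_to_docs(book_id, title, chapters):
    """Turn one book into one indexable document per chapter.

    `chapters` is a list of chapter-text strings. The composite id
    makes each chapter a standalone document ("docid unit"), so a
    search hit is a chapter that still points back to its book.
    """
    return [
        {
            "id": f"{book_id}_ch{n}",   # unique docid per chapter
            "book_id": book_id,          # lets the UI group hits by book
            "title": title,
            "chapter": n,
            "text": text,
        }
        for n, text in enumerate(chapters, start=1)
    ]

docs = chapters_to_docs("b42", "Moby-Dick", ["Call me Ishmael.", "..."])
```

(With 15-50 chapters per book, the document count grows by that factor, but each document stays small, which keeps per-hit scoring and highlighting cheap.)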
-
To unsubscribe, e-mail: java-user-unsubscr