As asked, that's really an unanswerable question. The math for running out
of document IDs is pretty easy, but "searched quickly" depends on too many
variables.
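
Just for a back-of-the-envelope sense of that math: Lucene document IDs are
Java ints, so a single index tops out around 2.1 billion documents. If each
chapter is its own document and a book runs 50 chapters on the high end,
that works out to roughly

  2,147,483,647 IDs / 50 chapters per book ~= 43 million books

per index before the IDs themselves become the constraint (assuming nothing
else gives out first, which in practice something will).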

I suspect, though, that long before you ran out of document IDs, you'd need
to shard your index. Have you looked at Solr?
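
If you do end up with several physical indexes on one box, stock Lucene can
at least search them as a single logical index. A minimal sketch along these
lines (3.0.x-style API; the paths and the "chapterText" field are made up,
imports and error handling omitted) would be:

  IndexReader shard1 = IndexReader.open(FSDirectory.open(new File("/indexes/shard1")));
  IndexReader shard2 = IndexReader.open(FSDirectory.open(new File("/indexes/shard2")));
  // MultiReader presents both shards to the searcher as one index
  IndexSearcher searcher = new IndexSearcher(new MultiReader(new IndexReader[] { shard1, shard2 }));
  TopDocs hits = searcher.search(new TermQuery(new Term("chapterText", "whale")), 10);

Solr handles the distributed case (shards on separate machines) for you,
which is part of why I'd look there first.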

Best
Erick

On Fri, Aug 13, 2010 at 9:24 PM, andynuss <andrew_n...@yahoo.com> wrote:

>
> Hi,
>
> Let's say that I am indexing large book documents broken into chapters, a
> typical book that you buy at Amazon. What would be the approximate limit
> on the number of books that can be indexed slowly and searched quickly?
> The search unit would be a chapter, so assume that a book is divided into
> 15-50 chapters. Any ideas?
>
> Andy
>
