Please don't cross-post to the dev list too; that
list is for internal Solr/Lucene development discussions.

As Jay says, "it depends". The general advice is to
use a single index (perhaps replicated) before you
start trying to go the distributed route (sharding).
Your question is a bit like asking "how big is a Java
program?"...

As for "how many documents": hook up a stress-test
tool (say JMeter or SolrMeter) and fire requests at
your server at your target rate to determine whether
you have too many documents on the server or not. The
answer also depends on what your hardware is. As asked,
your question is unanswerable.
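If you want something even quicker than JMeter or SolrMeter to get a first feel for latency, a few lines of Python can fire queries at a fixed rate and report a percentile. This is only a sketch, not a load-testing tool; the URL, core name ("mycore"), and queries below are hypothetical placeholders you would replace with your own:

```python
# Minimal latency smoke test against a Solr select handler.
# Assumptions (adjust to your setup): Solr on localhost:8983,
# a core named "mycore", JSON response writer enabled.
import json
import time
import urllib.parse
import urllib.request


def percentile(latencies, pct):
    """Return the pct-th percentile (0-100) of a list of latency samples."""
    ordered = sorted(latencies)
    index = min(len(ordered) - 1, int(len(ordered) * pct / 100))
    return ordered[index]


def fire_queries(base_url, queries, target_qps):
    """Send each query in turn at roughly target_qps; return per-request latencies in seconds."""
    interval = 1.0 / target_qps
    latencies = []
    for q in queries:
        url = base_url % urllib.parse.quote(q)
        start = time.monotonic()
        with urllib.request.urlopen(url) as resp:
            json.load(resp)  # force the full response body to be read
        elapsed = time.monotonic() - start
        latencies.append(elapsed)
        # Sleep off the remainder of this request's time slot, if any.
        time.sleep(max(0.0, interval - elapsed))
    return latencies


# Example usage (hypothetical endpoint and queries):
#   solr_url = "http://localhost:8983/solr/mycore/select?q=%s&wt=json"
#   lat = fire_queries(solr_url, ["solr", "lucene", "search"] * 100, target_qps=10)
#   print("p95 latency: %.3f s" % percentile(lat, 95))
```

The point is simply to measure at your real target rate against your real hardware; for sustained or concurrent load, use a proper tool as suggested above.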

I've seen 40-50M documents on a single machine.
And that wasn't necessarily a very big one.

Best
Erick

2011/9/29 Pengkai Qin <[email protected]>:
> Hi all,
>
> Now I'm doing research on Solr distributed search, and it is said that
> distributed search becomes reasonable at more than one million documents.
> So I want to know: does anyone have test results (such as time cost)
> comparing a single index with distributed search on more than one million
> documents? I need the test results urgently, thanks in advance!
>
> Best Regards,
> Pengkai
>

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
