Hi,

I'm running a read-only index with SOLR 1.3 on a server with 8GB RAM and
the heap set to 6GB. The index contains 17 million documents and occupies
63GB of disk space with compression turned on. The index is replicated
from the SOLR master every 5 minutes and should be able to support around
10 concurrent searches.
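
For context, the instance is started with roughly the following JVM
options (a sketch; the paths and the start command are placeholders, not
my exact values):

    java -Xms6g -Xmx6g \
         -Dsolr.solr.home=/opt/solr/home \
         -jar start.jar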

We're now hitting RAM-related errors like:

- java.lang.OutOfMemoryError: Java heap space
- java.lang.OutOfMemoryError: GC overhead limit exceeded

which over time make the SOLR instance unresponsive.
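
To get more data on what is actually filling the heap, I'm planning to
enable GC logging and a heap dump on OutOfMemoryError, along these lines
(standard HotSpot flags; the log paths are just examples):

    java -Xms6g -Xmx6g \
         -verbose:gc -XX:+PrintGCDetails -Xloggc:/var/log/solr-gc.log \
         -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/var/tmp \
         -jar start.jar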

Before asking for advice on how to optimize my setup, I'd kindly ask for
your experiences with setups of this size. Is it possible to run such a
large index on a single server? Can I support even larger indexes if I
tweak my configuration? At what point does an index need to be split
across multiple shards? And when should I start considering a setup with
Katta?
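
On the sharding question: as far as I understand it, SOLR 1.3's
distributed search would let me spread the index over several boxes and
query them via the shards parameter, roughly like this (the hostnames
are made up):

    http://shard1:8983/solr/select?q=foo
        &shards=shard1:8983/solr,shard2:8983/solr

but I'd like to avoid that extra complexity if one server can still cope.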

Thanks for your insights,

Thomas Koch, http://www.koch.ro
