Hi,

Some options:
* Yes, on the slave/search side you can reduce your cache sizes and lower the 
memory footprint (rough sketch after this list).
* You can also turn off norms on fields that don't need them and save memory 
there.
* You can increase your Xmx.
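
Here is a rough sketch of what those three knobs look like, assuming a 
1.4/3.x-style solrconfig.xml/schema.xml and Tomcat; the field name, cache 
sizes, and heap values are just placeholders to adjust for your setup:

  <!-- solrconfig.xml: smaller caches on the slave (sizes are placeholders) -->
  <filterCache      class="solr.FastLRUCache" size="4096" initialSize="1024" autowarmCount="256"/>
  <queryResultCache class="solr.LRUCache"     size="4096" initialSize="1024" autowarmCount="256"/>
  <documentCache    class="solr.LRUCache"     size="4096" initialSize="1024"/>

  <!-- schema.xml: omit norms on fields that don't need index-time boosts
       or length normalization -->
  <field name="some_text_field" type="text" indexed="true" stored="true" omitNorms="true"/>

  # Tomcat (e.g. bin/setenv.sh): raise the heap ceiling
  JAVA_OPTS="$JAVA_OPTS -Xms2g -Xmx3g"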

I don't know what version of Solr you have, but look through Lucene/Solr's 
CHANGES.txt to see whether any changes since your version affect memory 
requirements.

Otis
----

Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Lucene ecosystem search :: http://search-lucene.com/


>________________________________
>From: Steve Fatula <compconsult...@yahoo.com>
>To: "solr-user@lucene.apache.org" <solr-user@lucene.apache.org>
>Sent: Wednesday, November 9, 2011 3:33 PM
>Subject: Out of memory, not during import or updates of the index
>
>We get out-of-memory errors at rare times during the day. I know one cause of 
>this is data imports, but none are going on. I see in the wiki that document 
>adds have some quirks; we're not doing that either. I don't know what to 
>expect for memory use, though.
>
>We had Solr running under Tomcat with the heap set to 2GB. I presume cache 
>size has an effect on memory; that's set to 30,000 for the filter, document, 
>and queryResult caches. We have experimented with different sizes for a while, 
>and these limits are all lower than we used to have them set to. So I'm hoping 
>there is no sort of memory leak involved.
>
>In any case, some of the messages are:
>
>Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space
>
>
>Some look like this:
>
>Exception in thread "http-8080-22" java.lang.NullPointerException
>        at 
>java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
>...
>
>I presume the null pointer is a result of being out of memory. 
>
>Should Solr possibly need more than 2GB? What else can we tune that might 
>reduce memory usage?
>
>