Steve,

do you have any custom code in your Solr?
We had out-of-memory errors for exactly that reason: I was using one method 
to obtain the request, and it was leaking because I had not read the javadoc 
carefully enough. Since fixing that, no leak.
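
It was roughly this shape (a sketch, not our exact code; 
LocalSolrQueryRequest is just the example, the same applies to anything that 
acquires a searcher):

    import org.apache.solr.common.params.SolrParams;
    import org.apache.solr.core.SolrCore;
    import org.apache.solr.request.LocalSolrQueryRequest;
    import org.apache.solr.request.SolrQueryRequest;

    // The request holds a SolrIndexSearcher reference that is only
    // released by close(); the javadoc says to call it when you are done.
    // Skip it and every call leaks a searcher until the heap fills.
    class LeakFix {
        void runLocalQuery(SolrCore core, SolrParams params) {
            SolrQueryRequest req = new LocalSolrQueryRequest(core, params);
            try {
                // ... execute the request and read the response ...
            } finally {
                req.close(); // the call we were missing
            }
        }
    }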

What do you do after the OoME?

paul


On 9 Nov 2011, at 21:33, Steve Fatula wrote:

> We get out-of-memory errors at rare times during the day. I know one cause 
> of this is data imports, but none are going on. I see in the wiki that 
> document adds have some quirks; we aren't doing that either. I don't know 
> what to expect for memory use, though.
> 
> We have Solr running under Tomcat with the heap set to 2G. I presume cache 
> size has an effect on memory; ours is set to 30,000 each for the filter, 
> document, and queryResult caches. We have experimented with different sizes 
> for a while, and these limits are all lower than we used to have them. So 
> I'm hoping there's no sort of memory leak involved.
> 
> In any case, some of the messages are:
> 
> Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space
> 
> 
> Some look like this:
> 
> Exception in thread "http-8080-22" java.lang.NullPointerException
>         at 
> java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
> ...
> 
> I presume the null pointer is a result of being out of memory. 
> 
> Might Solr simply need more than 2GB? What else can we tune that might 
> reduce memory usage?
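
On the cache question: the filterCache in particular can dominate the heap, 
because a cached filter over a big index is stored as roughly one bit per 
document. A back-of-envelope sketch (the one-million-document index is a 
made-up number, plug in your own maxDoc; small result sets are stored more 
compactly, so this is the worst case):

    // Worst case: every filterCache entry is a bitset of maxDoc bits.
    public class FilterCacheEstimate {
        public static void main(String[] args) {
            long maxDoc = 1000000L;          // hypothetical index size
            long entries = 30000L;           // your filterCache limit
            long bytesPerEntry = maxDoc / 8; // one bit per document
            long totalMb = entries * bytesPerEntry / (1024 * 1024);
            System.out.println("filterCache worst case: ~" + totalMb + " MB");
        }
    }

That prints ~3576 MB: a full filterCache on an index that size would blow a 
2G heap by itself, before the document and queryResult caches even count, so 
30,000 may simply be too high. To see what is actually filling the heap, 
start Tomcat with -XX:+HeapDumpOnOutOfMemoryError and inspect the dump. And 
the NullPointerException is almost certainly just collateral damage from the 
OutOfMemoryError.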
