From: Andre Bois-Crettez <andre.b...@kelkoo.com>
>To: "solr-user@lucene.apache.org" <solr-user@lucene.apache.org>
>Sent: Thursday, November 10, 2011 7:02 AM
>Subject: Re: Out of memory, not during import or updates of the index
>
>You can add JVM parameters to better trace the heap usage with 
>-XX:+PrintGCDetails -verbose:gc -Xloggc:/your/gc/logfile
>
>Graphing that over time may help you see whether you are constantly near the
>limit or only at particular times, and to correlate that with other operations
>(insertions, commits, optimizes, ...)
>
>
>That would be true, except there are NO insertions, deletions, updates, etc.,
>as those are done in the middle of the night, long before the problem occurs,
>using the data import handler. Right now, for example, we've raised the limit
>to 2.5GB and currently 2GB is free. The only activity is searches via the HTTP
>interface, nothing we code in Java, etc. So the only thing consuming memory
>within Tomcat is Solr, the only app.

So, since the caches are all full and 2GB of the 2.5GB is free, yet the other
day all 2GB were consumed and we ran out of memory, something must have
consumed that 1.5GB of free space.

I did change the garbage collector today from the default one to the parallel
one; it should have been that way in the first place. I'm not sure whether this
will matter as far as running out of space goes. I do have a GC log as well
(now). There is only one collection every minute or so, and in 11 hours, not
one full GC.
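
In case it helps anyone else, the collector switch itself is just another pair
of JVM flags; a rough sketch, using standard HotSpot options added to the same
CATALINA_OPTS as above (our choice of collector, not something Solr dictates):

# parallel collection for both the young and old generations
CATALINA_OPTS="$CATALINA_OPTS -XX:+UseParallelGC -XX:+UseParallelOldGC"

With -XX:+PrintGCDetails enabled, a full collection shows up in the GC log as a
"Full GC" entry, so the absence of one over 11 hours is easy to confirm there.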
