Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Andre Bois-Crettez

Using Solr 3.4.0. The changelog actually says memory usage should be reduced in
that version. We were previously on a much older version, 1.something.
Norms are off on all fields where they can be turned off.
I'm just hoping this new version doesn't have any leaks. Does FastLRUCache vs.
LRUCache make any memory difference?


You can add JVM parameters to better trace the heap usage with 
-XX:+PrintGCDetails -verbose:gc -Xloggc:/your/gc/logfile


Graphing that over time may help you see whether you are constantly near the
limit or only at particular times, and to correlate that with other operations
(insertions, commits, optimizes, ...).
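
For example, a sketch of where these flags can go in a Tomcat setup like the
one in this thread (the setenv.sh hook is standard Tomcat convention; the log
path and the extra -XX:+PrintGCTimeStamps flag are illustrative additions):

    # $CATALINA_HOME/bin/setenv.sh -- read by catalina.sh at startup
    # The -Xloggc path is an example; point it somewhere writable.
    CATALINA_OPTS="$CATALINA_OPTS -verbose:gc -XX:+PrintGCDetails \
        -XX:+PrintGCTimeStamps -Xloggc:/var/log/tomcat/solr-gc.log"

PrintGCTimeStamps makes it easier to line the log up with insert/commit/optimize
times when graphing.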



--
André Bois-Crettez

Search technology, Kelkoo
http://www.kelkoo.com/



Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Paul Libbrecht
Steve,

Do you have any custom code in your Solr?
We had out-of-memory errors for exactly that reason: I was using a method to
obtain the request that leaked... I had not read the javadoc carefully enough.
Since fixing that, no leak.
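
(A hypothetical illustration, not Paul's actual code: the generic shape of such
a leak is a per-request object retained in a long-lived structure, so every
request pins memory that is never released:)

    // Hypothetical sketch of the leak pattern described above (not Paul's code).
    // Per-request objects accumulate in a static collection and are never
    // removed, so heap usage grows with traffic until OutOfMemoryError.
    import java.util.ArrayList;
    import java.util.List;

    class RequestTracker {
        // Everything added here stays strongly reachable forever.
        private static final List<Object> SEEN = new ArrayList<Object>();

        static void onRequest(Object request) {
            SEEN.add(request); // added per request, never cleared -> leak
        }
    }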

What do you do after the OoME?

paul


On 9 Nov. 2011, at 21:33, Steve Fatula wrote:

 We get out-of-memory errors at rare times during the day. I know one reason
 for this is data imports, but none are going on. I see in the wiki that
 document adds have some quirks; we're not doing that. I don't know what to
 expect for memory use, though.
 
 We had Solr running under Tomcat set to 2GB of RAM. I presume cache size has
 an effect on memory; it's set to 30,000 each for the filter, document, and
 queryResult caches. We've experimented with different sizes for a while; these
 limits are all lower than we used to have them. So, I'm hoping there's no
 memory leak involved.
 
 In any case, some of the messages are:
 
 Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space
 
 
 Some look like this:
 
 Exception in thread "http-8080-22" java.lang.NullPointerException
 at 
 java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
 ...
 
 I presume the null pointer is a result of being out of memory. 
 
 Should Solr possibly need more than 2GB? What else can we tune that might 
 reduce memory usage?



Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula


From: Paul Libbrecht p...@hoplahup.net
To: solr-user@lucene.apache.org
Sent: Thursday, November 10, 2011 7:19 AM
Subject: Re: Out of memory, not during import or updates of the index

do you have any custom code in your Solr?
We had out-of-memory errors just because of that, I was using one method to 
obtain the request which was leaking... had not read javadoc carefully enough. 
Since then, no leak.

There is no custom code; Solr is called via HTTP, so only the web interface is
used. It can't be our code, since Solr runs in its own Tomcat instance, and
only Solr.

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula
From: Andre Bois-Crettez andre.b...@kelkoo.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Thursday, November 10, 2011 7:02 AM
Subject: Re: Out of memory, not during import or updates of the index

You can add JVM parameters to better trace the heap usage with 
-XX:+PrintGCDetails -verbose:gc -Xloggc:/your/gc/logfile

Graphing that over time may help you see whether you are constantly near the
limit or only at particular times, and to correlate that with other operations
(insertions, commits, optimizes, ...).


That would be true, except there are NO insertions, deletions, updates, etc.;
those are done in the middle of the night, long before the problem occurs,
using the data import handler. Right now, for example, we've raised the limit
to 2.5GB, and currently 2GB is free. The only activity is searches via the
HTTP interface; nothing we code in Java, etc. So the only thing consuming
memory within Tomcat is Solr, the only app.

So: the caches are all full and 2GB of the 2.5GB is free, yet the other day
all 2GB were consumed and we ran out of memory. Something consumed that
~1.5GB of free space.

I did change the garbage collector today from the default to the parallel
collector; it should have been that way in the first place. Not sure whether
this will matter for running out of space. I do have a GC log as well (now).
There is only one collection every minute or so, and in 11 hours, not one
full GC.
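
(For reference, on the HotSpot JVMs of that era the change Steve describes is
a flag swap; the values below are illustrative and mirror the 2.5GB heap
mentioned above:)

    # Example JVM options (illustrative): 2.5GB heap, parallel collector
    CATALINA_OPTS="$CATALINA_OPTS -Xmx2560m -XX:+UseParallelGC -XX:+UseParallelOldGC"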

Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Mark Miller
How big is your index?

What kind of queries do you tend to see? Do you facet on a lot of fields? Sort 
on a lot of fields?

Before you get the OOM, while things are running along nicely, how much RAM is used?

On Nov 9, 2011, at 3:33 PM, Steve Fatula wrote:

 We get out-of-memory errors at rare times during the day. I know one reason
 for this is data imports, but none are going on. I see in the wiki that
 document adds have some quirks; we're not doing that. I don't know what to
 expect for memory use, though.
 
 We had Solr running under Tomcat set to 2GB of RAM. I presume cache size has
 an effect on memory; it's set to 30,000 each for the filter, document, and
 queryResult caches. We've experimented with different sizes for a while; these
 limits are all lower than we used to have them. So, I'm hoping there's no
 memory leak involved.
 
 In any case, some of the messages are:
 
 Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space
 
 
 Some look like this:
 
 Exception in thread "http-8080-22" java.lang.NullPointerException
 at 
 java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
 ...
 
 I presume the null pointer is a result of being out of memory. 
 
 Should Solr possibly need more than 2GB? What else can we tune that might 
 reduce memory usage?

- Mark Miller
lucidimagination.com



Re: Out of memory, not during import or updates of the index

2011-11-10 Thread Steve Fatula
From: Mark Miller markrmil...@gmail.com
To: solr-user solr-user@lucene.apache.org
Sent: Thursday, November 10, 2011 3:00 PM
Subject: Re: Out of memory, not during import or updates of the index

How big is your index?

The total for the data dir is 651M.


What kind of queries do you tend to see? Do you facet on a lot of fields? Sort 
on a lot of fields?

If a query is sorted, it's on one field. A fair amount of faceting. Not sure
how to answer what kind of queries; various kinds, all dismax. Remember, 2GB
of RAM was allotted to Solr/Tomcat. Most of the queries run in very small
fractions of a second; the longest query we have runs in 140ms. Most are 1ms.
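
(For illustration only, the sort of request described here: a dismax query with
faceting and a single sort field. The handler path, field names, and values are
hypothetical, not taken from this thread:)

    http://localhost:8080/solr/select?defType=dismax&q=blue+widget
        &facet=true&facet.field=category&facet.field=brand
        &sort=price+asc&rows=20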


Before you get the OOM, while things are running along nicely, how much RAM is used?

I wish I had been sitting there just before it happened. Now that we have the
GC log, it will be easier to tell, I suppose.

Out of memory, not during import or updates of the index

2011-11-09 Thread Steve Fatula
We get out-of-memory errors at rare times during the day. I know one reason for
this is data imports, but none are going on. I see in the wiki that document
adds have some quirks; we're not doing that. I don't know what to expect for
memory use, though.

We had Solr running under Tomcat set to 2GB of RAM. I presume cache size has an
effect on memory; it's set to 30,000 each for the filter, document, and
queryResult caches. We've experimented with different sizes for a while; these
limits are all lower than we used to have them. So, I'm hoping there's no
memory leak involved.
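
(Editorial aside: these sizes live in solrconfig.xml. A sketch with the values
described above; the cache classes, initialSize, and autowarmCount values are
illustrative, not taken from this thread:)

    <!-- solrconfig.xml (illustrative): the three caches sized at 30,000 -->
    <filterCache class="solr.FastLRUCache" size="30000"
                 initialSize="512" autowarmCount="128"/>
    <queryResultCache class="solr.LRUCache" size="30000"
                      initialSize="512" autowarmCount="128"/>
    <documentCache class="solr.LRUCache" size="30000"
                   initialSize="512"/>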

In any case, some of the messages are:

Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space


Some look like this:

Exception in thread "http-8080-22" java.lang.NullPointerException
        at 
java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
...

I presume the null pointer is a result of being out of memory. 

Should Solr possibly need more than 2GB? What else can we tune that might 
reduce memory usage?

Re: Out of memory, not during import or updates of the index

2011-11-09 Thread Otis Gospodnetic
Hi,

Some options:
* Yes, on the slave/search side you can reduce your cache sizes and lower the
memory footprint.
* You can also turn off norms in various fields if you don't need them, and
save memory there.
* You can increase your Xmx.
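
(For Tomcat, a sketch of that last option; the setenv.sh location is standard
Tomcat convention, and the sizes are illustrative:)

    # $CATALINA_HOME/bin/setenv.sh: raise the heap ceiling
    CATALINA_OPTS="$CATALINA_OPTS -Xms2g -Xmx3g"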

I don't know what version of Solr you have, but look through Lucene/Solr's 
CHANGES.txt to see if there were any changes that affect memory requirements 
since your version of Solr.

Otis


Sematext :: http://sematext.com/ :: Solr - Lucene - Nutch
Lucene ecosystem search :: http://search-lucene.com/



From: Steve Fatula compconsult...@yahoo.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Wednesday, November 9, 2011 3:33 PM
Subject: Out of memory, not during import or updates of the index

We get out-of-memory errors at rare times during the day. I know one reason for
this is data imports, but none are going on. I see in the wiki that document
adds have some quirks; we're not doing that. I don't know what to expect for
memory use, though.

We had Solr running under Tomcat set to 2GB of RAM. I presume cache size has an
effect on memory; it's set to 30,000 each for the filter, document, and
queryResult caches. We've experimented with different sizes for a while; these
limits are all lower than we used to have them. So, I'm hoping there's no
memory leak involved.

In any case, some of the messages are:

Exception in thread "http-8080-21" java.lang.OutOfMemoryError: Java heap space


Some look like this:

Exception in thread "http-8080-22" java.lang.NullPointerException
        at 
java.util.concurrent.ConcurrentLinkedQueue.offer(ConcurrentLinkedQueue.java:273)
...

I presume the null pointer is a result of being out of memory. 

Should Solr possibly need more than 2GB? What else can we tune that might 
reduce memory usage?



Re: Out of memory, not during import or updates of the index

2011-11-09 Thread Steve Fatula
From: Otis Gospodnetic otis_gospodne...@yahoo.com
To: solr-user@lucene.apache.org solr-user@lucene.apache.org
Sent: Wednesday, November 9, 2011 2:51 PM
Subject: Re: Out of memory, not during import or updates of the index

Hi,

Some options:
* Yes, on the slave/search side you can reduce your cache sizes and lower the
memory footprint.
* You can also turn off norms in various fields if you don't need them, and
save memory there.
* You can increase your Xmx.

I don't know what version of Solr you have, but look through Lucene/Solr's 
CHANGES.txt to see if there were any changes that affect memory requirements 
since your version of Solr.


Using Solr 3.4.0. The changelog actually says memory usage should be reduced in
that version. We were previously on a much older version, 1.something.

Norms are off on all fields where they can be turned off.

I'm just hoping this new version doesn't have any leaks. Does FastLRUCache vs.
LRUCache make any memory difference?
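
(For context on that last question: the implementation is chosen per cache via
the class attribute in solrconfig.xml; the sizes here are illustrative:)

    <!-- Swap implementations by changing the class attribute -->
    <filterCache class="solr.FastLRUCache" size="30000" initialSize="512"/>
    <!-- vs. -->
    <filterCache class="solr.LRUCache" size="30000" initialSize="512"/>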