On 12/13/2010 9:46 PM, Cameron Hurst wrote:
When I start the server I am using about 90MB of RAM, which is fine, and from my Google searches I found that this is normal. The issue comes when I start indexing data. In my solrconfig.xml file, my maximum RAM buffer is set to 32MB. In my mind that means the maximum RAM used by the servlet should be about 122MB, though increasing to 150MB wouldn't be out of reach. When I start indexing data and running searches, my memory usage slowly keeps increasing. The odd thing is that when I reindex the exact same data set, the memory usage increases every time, even though no new data has been added to the index. I stopped once usage went over 350MB of RAM.
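For reference, the RAM buffer the question describes is controlled by the `ramBufferSizeMB` setting in solrconfig.xml. A minimal fragment might look like this (depending on your Solr version, it lives under `<indexDefaults>`/`<mainIndex>` or a combined `<indexConfig>` element; the 32 here just mirrors the value mentioned above):

```xml
<indexConfig>
  <!-- Flush the in-memory index buffer to disk once it reaches ~32MB.
       This bounds the *indexing* buffer only, not Solr's overall heap use. -->
  <ramBufferSizeMB>32</ramBufferSizeMB>
</indexConfig>
```

Note that this setting only caps the buffer used while building index segments; caches, searchers, and the servlet container itself all consume heap on top of it, which is why total usage can climb well past buffer-plus-baseline.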

There could be large gaps in my understanding here, but one thing I have noticed about Java is that a program's memory usage tends to grow until it nearly fills the maximum heap size it has been allocated. To improve performance, garbage collection seems to be rather lazy until a large percentage of the max heap is in use. I've got a 2GB max heap size passed to Jetty when I start Solr. Memory usage hovers around 1.4GB, and it doesn't take very long to get there.
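If you want to see what ceiling your JVM is actually running with (e.g. whatever `-Xmx` value was passed to Jetty), a minimal sketch using the standard `Runtime` API is:

```java
// Minimal sketch: report the heap limits the current JVM is running with.
// maxMemory() reflects the -Xmx ceiling; totalMemory() is what the JVM has
// currently reserved from the OS, which grows lazily toward that ceiling.
public class HeapInfo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        long maxMb   = rt.maxMemory()   >> 20; // heap ceiling (-Xmx)
        long totalMb = rt.totalMemory() >> 20; // currently reserved heap
        long freeMb  = rt.freeMemory()  >> 20; // reserved but unused
        System.out.printf("max=%dMB total=%dMB used=%dMB%n",
                maxMb, totalMb, totalMb - freeMb);
    }
}
```

This is why an external process monitor can show usage far above what you'd expect from the RAM buffer alone: the JVM holds on to heap it has reserved, whether or not live objects fill it.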

Solr's search functionality, especially if you give it a sort parameter, is memory hungry. For each field you sort on, Lucene builds a large field cache entry. Solr's other caches also fill quickly. If you are storing a large amount of data in Solr for each document, the documentCache in particular will get quite large. Every time you reindex, you create a new searcher with new caches. The old searcher is eventually discarded, but I'm pretty sure that until garbage collection runs, its memory is not actually reclaimed.
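Those caches are all sized in the `<query>` section of solrconfig.xml, so you can cap how much heap they consume. A sketch of the relevant stanza (the size numbers here are illustrative, not recommendations):

```xml
<query>
  <!-- Each filterCache entry can be roughly one bit per document in the index. -->
  <filterCache class="solr.FastLRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <!-- queryResultCache entries are small (lists of doc IDs). -->
  <queryResultCache class="solr.LRUCache" size="512" initialSize="512" autowarmCount="0"/>
  <!-- documentCache entries hold stored fields, so large stored docs mean large entries. -->
  <documentCache class="solr.LRUCache" size="512" initialSize="512"/>
</query>
```

Shrinking these (and the autowarm counts) reduces both steady-state memory and the transient spike when a new searcher warms up alongside the old one during a commit.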

I don't know what your heap size is set to, but I'd be surprised if it's less than 1GB. Java is not going to be concerned about memory usage when it's only using 350MB of that, so I don't think it'll even try to run garbage collection.
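You can watch this behavior directly. The following sketch (assuming you run it with a modest heap like `-Xmx512m`; the exact figures are JVM-dependent) allocates some memory, drops the references, and shows that reported usage only falls after a collection actually runs:

```java
// Sketch: heap usage stays high after references are dropped, until a GC runs.
// System.gc() is only a hint, but most JVMs honor it with a full collection.
public class GcDemo {
    public static void main(String[] args) {
        Runtime rt = Runtime.getRuntime();
        byte[][] junk = new byte[64][];
        for (int i = 0; i < 64; i++) {
            junk[i] = new byte[1024 * 1024]; // allocate ~64MB total
        }
        long usedBefore = rt.totalMemory() - rt.freeMemory();
        junk = null; // all 64MB is now garbage, but not yet reclaimed
        long usedAfterDrop = rt.totalMemory() - rt.freeMemory();
        System.gc(); // request a collection
        long usedAfterGc = rt.totalMemory() - rt.freeMemory();
        System.out.printf("before drop: %dMB, after drop: %dMB, after GC: %dMB%n",
                usedBefore >> 20, usedAfterDrop >> 20, usedAfterGc >> 20);
    }
}
```

Typically the "after drop" number is nearly identical to "before drop", which is exactly what you see when a discarded searcher's caches linger in the heap until the collector gets around to them.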

Shawn
