What Erick said. That's a giant Filter Cache. Have a look at these Solr
metrics and note the Filter Cache in the middle:
http://www.flickr.com/photos/otis/8409088080/
Note how small the cache is and how high the hit rate is. Those are stats
for http://search-lucene.com/ and http://search-hadoop.com/, where you can
see facets on the right that end up being used as filter queries. Most
Solr apps I've seen had small Filter Caches.
Otis
--
Performance Monitoring * Log Analytics * Search Analytics
Solr Elasticsearch Support * http://sematext.com/
On Wed, Mar 5, 2014 at 3:34 PM, Erick Erickson erickerick...@gmail.com wrote:
This, BTW, is an ENORMOUS number of cached queries.
Here's a rough guide:
Each entry will be (length of query) + maxDoc/8 bytes long.
Think of the filterCache as a map where the key is the query
and the value is a bitmap large enough to hold maxDoc bits.
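To make the arithmetic concrete, here is a minimal sketch of that formula in Python. The numbers plugged in (maxDoc of 1.4 million and a cache size of 30, from the question below; an assumed average query length of 100 bytes) are illustrative, not measurements:

```python
# Rough filterCache heap estimate using the formula above:
# each entry ~ (length of query) + maxDoc/8 bytes,
# i.e. the query string plus a bitset with one bit per document.

def filter_cache_bytes(max_doc, num_entries, avg_query_len=100):
    """Approximate heap used by the filterCache, in bytes.

    avg_query_len is an assumed placeholder, not a Solr setting.
    """
    bitset_bytes = max_doc / 8          # one bit per doc in the index
    per_entry = avg_query_len + bitset_bytes
    return num_entries * per_entry

# 1.4 million docs, cache size 30 (the settings from this thread):
total = filter_cache_bytes(max_doc=1_400_000, num_entries=30)
print(f"{total / 1024**2:.1f} MiB")
```

With these inputs each bitset is about 175 KB, so 30 entries come to roughly 5 MiB, which is consistent with Ben's observation that the heap stayed far below 53 GB; the 53 GB figure only makes sense with a cache holding hundreds of thousands of entries.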
BTW, I'd kick this back to the default (512?) and periodically check
it with the admin Plugins/Stats page to see what kind of hit ratio
I have and adjust from there.
Best,
Erick
On Mon, Mar 3, 2014 at 11:00 AM, Benjamin Wiens
benjamin.wi...@gmail.com wrote:
How can we calculate how much heap memory the filter cache will consume?
We understand that in order to determine a good size, we also need to
evaluate how many filter queries would be used over a certain time period.
Here's our setting:
<filterCache
  class="solr.FastLRUCache"
  size="30"
  initialSize="30"
  autowarmCount="5"/>
According to the post below, 53 GB of RAM would be needed by the
filter cache alone with 1.4 million docs. Not sure if this is true
or how this would work.
Reference:
http://stackoverflow.com/questions/2004/solr-filter-cache-fastlrucache-takes-too-much-memory-and-results-in-out-of-mem
We filled the filter cache with SolrMeter and had a JVM heap size of
far less than 53 GB.
Can anyone chime in and enlighten us?
Thank you!
Ben Wiens Benjamin Mosior