rebecca,

i would suggest making sure you have some gc logging configured so you have 
some visibility into the JVM, especially if you don't already have JMX or an 
sflow agent configured to give you external visibility into those internal 
metrics

the options below just print out the gc activity to a log (a combined startup 
example follows the list)

-Xloggc:gc.log
-verbose:gc 
-XX:+PrintGCDateStamps
-XX:+PrintGCTimeStamps
-XX:+PrintGCDetails
-XX:+PrintTenuringDistribution
-XX:+PrintClassHistogram 
-XX:+PrintHeapAtGC 
-XX:+PrintGCApplicationConcurrentTime
-XX:+PrintGCApplicationStoppedTime
-XX:+PrintPromotionFailure 
-XX:+PrintAdaptiveSizePolicy
-XX:+PrintTLAB
-XX:+UseGCLogFileRotation
-XX:NumberOfGCLogFiles=5
-XX:GCLogFileSize=10m
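
as a rough sketch of how these fit together, assuming a jetty-based solr 
started via start.jar (the log path, heap sizes, and start.jar location here 
are illustrative, adjust for your install):

java -Xms2G -Xmx4G -XX:+UseG1GC \
     -Xloggc:/var/log/solr/gc.log -verbose:gc \
     -XX:+PrintGCDateStamps -XX:+PrintGCDetails \
     -XX:+PrintGCApplicationStoppedTime \
     -XX:+UseGCLogFileRotation -XX:NumberOfGCLogFiles=5 \
     -XX:GCLogFileSize=10m \
     -jar start.jar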




on the memory tuning side of things, as has already been mentioned, try to 
leave as much memory (outside the JVM) available to the OS so it can cache as 
much of the actual index as possible

in your case, you have a lot of RAM, so i would suggest starting with the gc 
logging options above, plus these very basic JVM memory settings:
-XX:+UseG1GC
-Xms2G
-Xmx4G
-XX:+UseAdaptiveSizePolicy 
-XX:MaxGCPauseMillis=1000 
-XX:GCTimeRatio=19

in short, start by letting the JVM tune itself ;) (-XX:MaxGCPauseMillis=1000 
sets a 1-second pause-time goal, and -XX:GCTimeRatio=19 asks the JVM to keep 
GC overhead to roughly 1/(1+19) = 5% of total run time)

then start looking at the actual GC behavior (this will be visible in the gc 
logs)
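
once the log is accumulating, a quick (illustrative) way to pull out the five 
longest stop-the-world pauses recorded by -XX:+PrintGCApplicationStoppedTime, 
assuming the log file is named gc.log:

grep -o 'stopped: [0-9.]* seconds' gc.log | awk '{print $2}' | sort -n | tail -5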


---
on the OS performance monitoring side, a few real-time tools i like to use on 
linux (a dstat example follows the list):

nmon
dstat
htop
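
for example, a dstat one-liner showing timestamps plus cpu, memory, disk, and 
network activity together (the 5-second interval is just an example):

dstat -tcmdn 5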

for trending, start with the basics (sysstat/sar, examples below) and build 
from there (hsflowd is super easy to install and get pushing data up to a 
central console like ganglia)
you can build on that by adding the sflow JVM agent to your solr environment
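
once sysstat is collecting, a few sar invocations i find useful (the intervals 
are just examples):

sar -u 5        # cpu utilization every 5 seconds
sar -r 5        # memory and page-cache usage
sar -d 5        # disk activity
sar -q          # load average / run queue from today's collected history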

enabling the JMX interface on jetty will let you use tools like jconsole or 
jvisualvm
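
a minimal sketch of the JVM flags for exposing JMX remotely (the port number 
is arbitrary, and disabling auth/ssl like this is only reasonable on a 
locked-down network):

-Dcom.sun.management.jmxremote
-Dcom.sun.management.jmxremote.port=18983
-Dcom.sun.management.jmxremote.authenticate=false
-Dcom.sun.management.jmxremote.ssl=false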




________________________________________
From: François Schiettecatte <fschietteca...@gmail.com>
Sent: Tuesday, February 24, 2015 17:06
To: solr-user@lucene.apache.org
Subject: Re: how to debug solr performance degradation

Rebecca

You don’t want to give all the memory to the JVM. You want to give it just 
enough for it to work optimally and leave the rest of the memory for the OS to 
use for caching data. Giving the JVM too much memory can result in worse 
performance because of GC. There is no magic formula to figuring out the memory 
allocation for the JVM, that is very dependent on the workload. In your case I 
would start with 5GB, and increment by 5GB with each run.

I also use these settings for the JVM

-XX:+UseG1GC -Xms1G -Xmx1G

-XX:+AggressiveOpts -XX:+OptimizeStringConcat -XX:+ParallelRefProcEnabled 
-XX:MaxGCPauseMillis=200

I got them from this list, so I can't take credit for them, but they work for me.


Cheers

François


> On Feb 24, 2015, at 7:45 PM, Tang, Rebecca <rebecca.t...@ucsf.edu> wrote:
>
> We gave the machine 180G mem to see if it improves performance.  However,
> after we increased the memory, Solr started using only 5% of the physical
> memory.  It has always used 90-something%.
>
> What could be causing solr to not grab all the physical memory (why is it
> grabbing so little)?
>
>
> Rebecca Tang
> Applications Developer, UCSF CKM
> Industry Documents Digital Libraries
> E: rebecca.t...@ucsf.edu
>
>
>
>
>
> On 2/24/15 12:44 PM, "Shawn Heisey" <apa...@elyograg.org> wrote:
>
>> On 2/24/2015 1:09 PM, Tang, Rebecca wrote:
>>> Our solr index used to perform OK on our beta production box (anywhere
>>> between 0-3 seconds to complete any query), but today I noticed that the
>>> performance is very bad (queries take between 12-15 seconds).
>>>
>>> I haven't updated the solr index configuration
>>> (schema.xml/solrconfig.xml) lately.  All that's changed is the data --
>>> every month, I rebuild the solr index from scratch and deploy it to the
>>> box.  We will eventually go to incremental builds. But for now, all
>>> indexes are built from scratch.
>>>
>>> Here are the stats:
>>> Solr index size 183G
>>> Documents in index 14364201
>>> We just have single solr box
>>> It has 100G memory
>>> 500G Harddrive
>>> 16 cpus
>>
>> The bottom line on this problem, and I'm sure it's not something you're
>> going to want to hear:  You don't have enough memory available to cache
>> your index.  I'd plan on at least 192GB of RAM for an index this size,
>> and 256GB would be better.
>>
>> Depending on the exact index schema, the nature of your queries, and how
>> large your Java heap for Solr is, 100GB of RAM could be enough for good
>> performance on an index that size ... or it might be nowhere near
>> enough.  I would imagine that one of two things is true here, possibly
>> both:  1) Your queries are very complex and involve accessing a very
>> large percentage of the index data.  2) Your Java heap is enormous,
>> leaving very little RAM for the OS to automatically cache the index.
>>
>> Adding more memory to the machine, if that's possible, might fix some of
>> the problems.  You can find a discussion of the problem here:
>>
>> http://wiki.apache.org/solr/SolrPerformanceProblems
>>
>> If you have any questions after reading that wiki article, feel free to
>> ask them.
>>
>> Thanks,
>> Shawn
>>
>
