Re: Garbage Collectors

2009-04-17 Thread Bill Au
I would also include the -XX:+HeapDumpOnOutOfMemoryError option to get
a heap dump when the JVM runs out of heap space.
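For example, something along these lines would write a dump to disk when the OOM occurs. This is a sketch: the dump directory is illustrative, and -XX:HeapDumpPath is optional (it defaults to the JVM's working directory).

```shell
# Sketch: add heap-dump-on-OOM options to the existing JVM flags.
# The path below is illustrative -- point it at a disk with enough
# free space to hold a dump roughly the size of the heap (-Xmx).
JAVA_OPTS="$JAVA_OPTS \
  -XX:+HeapDumpOnOutOfMemoryError \
  -XX:HeapDumpPath=/var/tmp/solr-heap-dumps"
```

The resulting .hprof file can then be opened in jhat or a profiler after the crash, rather than having to catch the leak live.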



On Thu, Apr 16, 2009 at 9:43 PM, Bryan Talbot btal...@aeriagames.com wrote:

 If you're using Java 5 or 6, jmap is a useful tool for tracking down memory
 leaks.

 http://java.sun.com/javase/6/docs/technotes/tools/share/jmap.html

 jmap -histo:live pid

 will print a histogram of all live objects in the heap.  Start at the top
 and work your way down until you find something suspicious -- the trick is
 in knowing what is suspicious of course.


 -Bryan





 On Apr 16, 2009, at 3:40 PM, David Baker wrote:

  Otis Gospodnetic wrote:

 Personally, I'd start from scratch:
 -Xmx -Xms...

 -server is not even needed any more.

 If you are not using Java 1.6, I suggest you do.

 Next, I'd try to investigate why objects are not being cleaned up - this
 should not be happening in the first place.  Is Solr the only webapp
 running?


 Otis
 --
 Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



 - Original Message 

  From: David Baker dav...@mate1inc.com
 To: solr-user@lucene.apache.org
 Sent: Thursday, April 16, 2009 3:33:18 PM
 Subject: Garbage Collectors

 I have an issue with garbage collection on our Solr servers.  We have an
 issue where the old generation never gets cleaned up on one of our
 servers.  This server has a little over 2 million records which are updated
 every hour or so.  I have tried the parallel GC and the concurrent GC.  The
 parallel seems more stable for us, but both end up running out of memory.  I
 have increased the memory allocated to the servers, but this just seems to
 delay the problem.  My question is: what are the suggested options for using
 the parallel GC?  Currently we are using something of this nature:

 -server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy
 -XX:+UseParallelOldGC -XX:GCTimeRatio=19 -XX:NewSize=128m
 -XX:SurvivorRatio=2 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr

 I am new to solr and GC tuning, so any advice is appreciated.

  Thanks for the reply, yes, Solr is the only app running under this
 Tomcat server. I will remove -server, and other options except the heap
 allocation options and see how it performs. Any suggestions on how to go
 about finding out why objects are not being cleaned up if these changes don't
 work?





Re: Garbage Collectors

2009-04-17 Thread Otis Gospodnetic

The only thing that comes to mind is running Solr under a profiler (e.g. 
YourKit) and figuring out which objects are not getting cleaned up and who's 
holding references to them.
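Short of a full profiler, the JDK 6 command-line tools can give a first cut at the same question. A minimal sketch, assuming a JDK 6 install; the pid and file names are illustrative:

```shell
# Take a binary heap dump of live objects from the running JVM
# (replace 1234 with the Solr JVM's pid):
jmap -dump:live,format=b,file=solr-heap.bin 1234

# Browse the dump in a web UI on port 7000; jhat ships with JDK 6
# and can show which objects are holding references to the ones
# that never get collected.
jhat solr-heap.bin
```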

 Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 
 From: David Baker dav...@mate1inc.com
 To: solr-user@lucene.apache.org
 Sent: Thursday, April 16, 2009 6:40:31 PM
 Subject: Re: Garbage Collectors
 
 Otis Gospodnetic wrote:
  Personally, I'd start from scratch:
  -Xmx -Xms...
  
  -server is not even needed any more.
  
  If you are not using Java 1.6, I suggest you do.
  
  Next, I'd try to investigate why objects are not being cleaned up - this 
 should not be happening in the first place.  Is Solr the only webapp running?
  
  
  Otis
  --
  Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch
  
  
  
  - Original Message 
   
  From: David Baker 
  To: solr-user@lucene.apache.org
  Sent: Thursday, April 16, 2009 3:33:18 PM
  Subject: Garbage Collectors
  
  I have an issue with garbage collection on our Solr servers.  We have an
  issue where the old generation never gets cleaned up on one of our servers.
  This server has a little over 2 million records which are updated every hour
  or so.  I have tried the parallel GC and the concurrent GC.  The parallel
  seems more stable for us, but both end up running out of memory.  I have
  increased the memory allocated to the servers, but this just seems to delay
  the problem.  My question is: what are the suggested options for using the
  parallel GC?  Currently we are using something of this nature:
  
  -server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy 
  -XX:+UseParallelOldGC 
 -XX:GCTimeRatio=19 -XX:NewSize=128m -XX:SurvivorRatio=2 
 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr
  
  I am new to solr and GC tuning, so any advice is appreciated.
 
 Thanks for the reply, yes, Solr is the only app running under this Tomcat
 server. I will remove -server, and other options except the heap allocation
 options and see how it performs. Any suggestions on how to go about finding
 out why objects are not being cleaned up if these changes don't work?



Garbage Collectors

2009-04-16 Thread David Baker
I have an issue with garbage collection on our Solr servers.  We have an 
issue where the old generation never gets cleaned up on one of our 
servers.  This server has a little over 2 million records which are 
updated every hour or so.  I have tried the parallel GC and the 
concurrent GC.  The parallel seems more stable for us, but both end up 
running out of memory.  I have increased the memory allocated to the 
servers, but this just seems to delay the problem.  My question is: what 
are the suggested options for using the parallel GC?  Currently we are 
using something of this nature:


-server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy 
-XX:+UseParallelOldGC -XX:GCTimeRatio=19 -XX:NewSize=128m 
-XX:SurvivorRatio=2 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr


I am new to solr and GC tuning, so any advice is appreciated.
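One low-overhead way to confirm that the old generation really only ever grows is to enable GC logging alongside the flags above. A sketch using standard Java 5/6 HotSpot logging flags; the log path is illustrative:

```shell
# GC logging flags for Java 5/6 HotSpot. Watch the old-generation
# occupancy reported after each full GC -- if it never drops back
# down, live objects are being retained and GC tuning alone won't
# fix it. The log file path below is illustrative.
-verbose:gc -XX:+PrintGCDetails -XX:+PrintGCTimeStamps \
-Xloggc:/var/log/solr/gc.log
```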


Re: Garbage Collectors

2009-04-16 Thread Otis Gospodnetic

Personally, I'd start from scratch:
-Xmx -Xms...

-server is not even needed any more.

If you are not using Java 1.6, I suggest you do.

Next, I'd try to investigate why objects are not being cleaned up - this should 
not be happening in the first place.  Is Solr the only webapp running?


Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 
 From: David Baker dav...@mate1inc.com
 To: solr-user@lucene.apache.org
 Sent: Thursday, April 16, 2009 3:33:18 PM
 Subject: Garbage Collectors
 
 I have an issue with garbage collection on our Solr servers.  We have an
 issue where the old generation never gets cleaned up on one of our servers.
 This server has a little over 2 million records which are updated every hour
 or so.  I have tried the parallel GC and the concurrent GC.  The parallel
 seems more stable for us, but both end up running out of memory.  I have
 increased the memory allocated to the servers, but this just seems to delay
 the problem.  My question is: what are the suggested options for using the
 parallel GC?  Currently we are using something of this nature:
 
 -server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy -XX:+UseParallelOldGC 
 -XX:GCTimeRatio=19 -XX:NewSize=128m -XX:SurvivorRatio=2 
 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr
 
 I am new to solr and GC tuning, so any advice is appreciated.



Re: Garbage Collectors

2009-04-16 Thread Bryan Talbot
If you're using Java 5 or 6, jmap is a useful tool for tracking down 
memory leaks.


http://java.sun.com/javase/6/docs/technotes/tools/share/jmap.html

jmap -histo:live pid

will print a histogram of all live objects in the heap.  Start at the  
top and work your way down until you find something suspicious -- the  
trick is in knowing what is suspicious of course.
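One practical way to spot what counts as suspicious is to diff two histograms taken some time apart: classes whose instance counts only ever grow between snapshots are the leak candidates. A sketch; the pid, file names, and interval are illustrative:

```shell
# Capture two live-object histograms a few minutes apart
# (replace 1234 with the Solr JVM's pid):
jmap -histo:live 1234 > histo-before.txt
sleep 300
jmap -histo:live 1234 > histo-after.txt

# Compare the top entries; classes whose instance counts grow
# steadily across snapshots are the ones to investigate.
diff histo-before.txt histo-after.txt | head -40
```

Note that `-histo:live` forces a full GC before counting, so anything still in the histogram is genuinely reachable, not just garbage awaiting collection.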



-Bryan




On Apr 16, 2009, at 3:40 PM, David Baker wrote:


Otis Gospodnetic wrote:

Personally, I'd start from scratch:
-Xmx -Xms...

-server is not even needed any more.

If you are not using Java 1.6, I suggest you do.

Next, I'd try to investigate why objects are not being cleaned up -  
this should not be happening in the first place.  Is Solr the only  
webapp running?



Otis
--
Sematext -- http://sematext.com/ -- Lucene - Solr - Nutch



- Original Message 


From: David Baker dav...@mate1inc.com
To: solr-user@lucene.apache.org
Sent: Thursday, April 16, 2009 3:33:18 PM
Subject: Garbage Collectors

I have an issue with garbage collection on our Solr servers.  We have 
an issue where the old generation never gets cleaned up on one of our 
servers.  This server has a little over 2 million records which are 
updated every hour or so.  I have tried the parallel GC and the 
concurrent GC.  The parallel seems more stable for us, but both end up 
running out of memory.  I have increased the memory allocated to the 
servers, but this just seems to delay the problem.  My question is: 
what are the suggested options for using the parallel GC?  Currently 
we are using something of this nature:


-server -Xmx4096m -Xms512m -XX:+UseAdaptiveSizePolicy 
-XX:+UseParallelOldGC -XX:GCTimeRatio=19 -XX:NewSize=128m 
-XX:SurvivorRatio=2 -Dsolr.solr.home=/usr/local/solr-tomcat-fi/solr


I am new to solr and GC tuning, so any advice is appreciated.

Thanks for the reply, yes, Solr is the only app running under this 
Tomcat server. I will remove -server, and other options except the 
heap allocation options and see how it performs. Any suggestions on 
how to go about finding out why objects are not being cleaned up if 
these changes don't work?