I've got a really odd situation that I'm not understanding.  I figured
the gurus could explain it to me.

I have two Tomcat instances deployed as services on the same 64-bit
Windows Server 2008 machine with the 64-bit Sun JDK 1.6.0_13.

They are running the same apps but for different user bases and were
deployed per the Tomcat doc on running multiple instances.

The only difference in their configs, besides the obvious file pointers,
is the memory settings.

 

Instance 1 is -Xmx5000M -Xms5000M

Instance 2 is -Xmx4096M -Xms3092M (yes, I know, it's best to make them
the same)

Neither has a Thread Stack Size set.
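
For reference, since both run as Windows services, the memory settings
would have been applied through the procrun service wrapper, roughly
like this (the service names below are placeholders, and I'm assuming
the Tomcat 6 executable):

  rem Instance 1: fixed 5000 MB heap
  tomcat6 //US//TomcatInstance1 --JvmMs=5000 --JvmMx=5000

  rem Instance 2: 3092 MB initial, 4096 MB max
  tomcat6 //US//TomcatInstance2 --JvmMs=3092 --JvmMx=4096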

 

When I did a thread dump on each instance, I noticed that the PermGen
space for each instance was different.  Even more interesting, the
instance with the smaller memory settings had the larger PermGen.

 

Instance 1:

Heap
 PSYoungGen      total 1622976K, used 714557K [0x0000000155950000, 0x00000001bdbf0000, 0x00000001bdbf0000)
  eden space 1534528K, 44% used [0x0000000155950000,0x000000017f3795b0,0x00000001b33e0000)
  from space 88448K, 36% used [0x00000001b8590000,0x00000001ba536148,0x00000001bdbf0000)
  to   space 83648K, 0% used [0x00000001b33e0000,0x00000001b33e0000,0x00000001b8590000)
 PSOldGen        total 3413376K, used 867706K [0x00000000853f0000, 0x0000000155950000, 0x0000000155950000)
  object space 3413376K, 25% used [0x00000000853f0000,0x00000000ba34ead0,0x0000000155950000)
 PSPermGen       total 43712K, used 43613K [0x000000007fff0000, 0x0000000082aa0000, 0x00000000853f0000)
  object space 43712K, 99% used [0x000000007fff0000,0x0000000082a875d8,0x0000000082aa0000)

 

Instance 2:

Heap
 PSYoungGen      total 989312K, used 149224K [0x000000012fea0000, 0x000000016fea0000, 0x00000001853f0000)
  eden space 932544K, 15% used [0x000000012fea0000,0x0000000138c55de0,0x0000000168d50000)
  from space 56768K, 7% used [0x0000000168d50000,0x0000000169154480,0x000000016c4c0000)
  to   space 54400K, 0% used [0x000000016c980000,0x000000016c980000,0x000000016fea0000)
 PSOldGen        total 2097152K, used 174926K [0x00000000853f0000, 0x00000001053f0000, 0x000000012fea0000)
  object space 2097152K, 8% used [0x00000000853f0000,0x000000008fec3a20,0x00000001053f0000)
 PSPermGen       total 86016K, used 45979K [0x000000007fff0000, 0x00000000853f0000, 0x00000000853f0000)
  object space 86016K, 53% used [0x000000007fff0000,0x0000000082cd6ef8,0x00000000853f0000)

 

Obviously, I'm concerned about the 99% used value in the first
instance, and I'll be going back to address it. However, I would have
expected its PermGen to be the same size as instance 2's.
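
When I do go back to it, my plan is just to pin PermGen explicitly on
both instances via the service JVM options, along these lines (256M is
a guessed ceiling, not a tested value):

  -XX:PermSize=256M -XX:MaxPermSize=256M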

 

Can someone explain what's happening here that I'm not seeing? I didn't
think the PermGen grew or shrank over time.
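
To see whether it's actually moving, I figure I can sample it over time
with jstat; the PC/PU columns are PermGen capacity and usage in KB (the
pid below is a placeholder for the service's java process):

  rem sample GC stats every 10 seconds, 12 samples
  jstat -gc 1234 10s 12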

 

Besides the heap settings, the only other difference is that #1 has
been running a day or two longer than #2.

 

=================================================================
Jeffrey W. Janner            e-mail: jeffrey.jan...@polydyne.com
PolyDyne Software Inc.          web: http://www.polydyne.com/
9390 Research Blvd.           phone: (512) 343-9100 x8930
Building 1, Suite 400           fax: (512) 343-9297
Austin, TX 78759
=================================================================

 


