[ http://jira.jboss.com/jira/browse/JBAS-88?page=comments#action_12310826 ]
     
SourceForge User commented on JBAS-88:
--------------------------------------


SourceForge Username: hal200
Logged In: YES 
user_id=907916

I'm getting the same problem with JBoss 3.2.1_Tomcat-4.1.24.
In my case, it seems to be triggered by a strange
interaction with the Struts libraries.  I threw together a
quick script that continuously redeploys the
struts-example.war file (with a 15-second pause between
deployments and undeployments) by popping the file in and
out of the deploy directory (instead of using the
deployer.jar tool; I'm not sure at this point whether that
makes a difference).
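For anyone who wants to reproduce this, the loop described above can be sketched roughly as follows. This is a hypothetical reconstruction, not the original script; the WAR name, deploy path, cycle count, and pause length are all assumptions:

```shell
#!/bin/sh
# Hypothetical reproduction sketch: repeatedly copy a WAR into the deploy
# directory and remove it again, so the URLDeploymentScanner hot-deploys
# and undeploys it on each cycle. All paths/parameters are assumptions.

redeploy_loop() {
    war=$1          # path to the WAR file to cycle
    deploy_dir=$2   # JBoss deploy directory watched by the scanner
    cycles=$3       # how many deploy/undeploy cycles to run
    pause=$4        # seconds to wait between each step

    i=0
    while [ "$i" -lt "$cycles" ]; do
        cp "$war" "$deploy_dir/"                # scanner deploys the WAR
        sleep "$pause"
        rm "$deploy_dir/$(basename "$war")"     # scanner undeploys it
        sleep "$pause"
        i=$((i + 1))
    done
}

# Example (paths are placeholders):
# redeploy_loop struts-example.war \
#     jboss-3.2.1_tomcat-4.1.24/server/default/deploy 100 15
```

Watching the GC output while this runs should show whether heap use climbs with each cycle.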

Sure enough, when watching the GC output, memory use keeps
increasing.  If I stop the script, the memory never seems
to be reclaimed.

If I leave it running, the server eventually exhausts its
heap and I start getting OutOfMemory errors at random
intervals.

-------------------------------------------------------------------------------
17:05:45,223 ERROR [URLDeploymentScanner] Failed to deploy:
[EMAIL PROTECTED]
url=file:/home/jon/cvs-test/canarie/lib/jboss-3.2.1_tomcat-4.1.24/server/canarie/deploy/IQXWeb.war,
deployedLastModified=0 }
org.jboss.deployment.DeploymentException: Could not create
deployment:
file:/home/jon/cvs-test/canarie/lib/jboss-3.2.1_tomcat-4.1.24/server/canarie/deploy/IQXWeb.war;
- nested throwable: (java.lang.OutOfMemoryError)
        at
org.jboss.deployment.MainDeployer.start(MainDeployer.java:853)
        at
org.jboss.deployment.MainDeployer.deploy(MainDeployer.java:640)
        at
org.jboss.deployment.MainDeployer.deploy(MainDeployer.java:613)
        at
sun.reflect.GeneratedMethodAccessor22.invoke(Unknown Source)
        at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:324)
        at
org.jboss.mx.capability.ReflectedMBeanDispatcher.invoke(ReflectedMBeanDispatcher.java:284)
        at
org.jboss.mx.server.MBeanServerImpl.invoke(MBeanServerImpl.java:549)
        at
org.jboss.mx.util.MBeanProxyExt.invoke(MBeanProxyExt.java:177)
        at $Proxy7.deploy(Unknown Source)
        at
org.jboss.deployment.scanner.URLDeploymentScanner.deploy(URLDeploymentScanner.java:302)
        at
org.jboss.deployment.scanner.URLDeploymentScanner.scan(URLDeploymentScanner.java:476)
        at
org.jboss.deployment.scanner.AbstractDeploymentScanner$ScannerThread.doScan(AbstractDeploymentScanner.java:200)
        at
org.jboss.deployment.scanner.AbstractDeploymentScanner$ScannerThread.loop(AbstractDeploymentScanner.java:211)
        at
org.jboss.deployment.scanner.AbstractDeploymentScanner$ScannerThread.run(AbstractDeploymentScanner.java:190)
Caused by: java.lang.OutOfMemoryError

-------------------------------------------------------------------------------

After leaving it for about an hour in this state, with GC
details turned on, we can see that sure enough, the memory
is not being reclaimed, despite the JVM's best efforts...

[Full GC [Tenured: 58303K->58303K(58304K), 0.3335490 secs]
64404K->64354K(64832K), 0.3336620 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3131040 secs]
64407K->64354K(64832K), 0.3131720 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3122840 secs]
64397K->64354K(64832K), 0.3123480 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3455040 secs]
64397K->64354K(64832K), 0.3455610 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3205630 secs]
64397K->64354K(64832K), 0.3206770 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3148190 secs]
64398K->64354K(64832K), 0.3148690 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3124290 secs]
64397K->64354K(64832K), 0.3125430 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3098440 secs]
64397K->64354K(64832K), 0.3099050 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3316150 secs]
64397K->64354K(64832K), 0.3317300 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3237010 secs]
64397K->64354K(64832K), 0.3238340 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3266160 secs]
64397K->64354K(64832K), 0.3266670 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.4335870 secs]
64397K->64354K(64832K), 0.4336530 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3097620 secs]
64397K->64354K(64832K), 0.3098310 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3138060 secs]
64398K->64354K(64832K), 0.3139190 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3125410 secs]
64397K->64354K(64832K), 0.3125970 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3188110 secs]
64397K->64354K(64832K), 0.3189260 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3175270 secs]
64397K->64354K(64832K), 0.3175970 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3120930 secs]
64397K->64354K(64832K), 0.3121500 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3116510 secs]
64397K->64354K(64832K), 0.3117170 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3098090 secs]
64397K->64354K(64832K), 0.3098610 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3256830 secs]
64398K->64354K(64832K), 0.3257470 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3076290 secs]
64397K->64354K(64832K), 0.3076920 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3166820 secs]
64397K->64354K(64832K), 0.3167470 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.5335920 secs]
64397K->64354K(64832K), 0.5336590 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3426230 secs]
64397K->64354K(64832K), 0.3427690 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.4453350 secs]
64397K->64354K(64832K), 0.4453950 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3081930 secs]
64397K->64354K(64832K), 0.3082590 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3183000 secs]
64398K->64354K(64832K), 0.3183600 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3121360 secs]
64397K->64354K(64832K), 0.3121970 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3134370 secs]
64397K->64354K(64832K), 0.3134990 secs]
[Full GC [Tenured: 58303K->58303K(58304K), 0.3156870 secs]
64397K->64354K(64832K), 0.3157560 secs]
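For reference, GC output in this format comes from the Sun HotSpot verbose-GC flags. A minimal sketch of how the server might be launched to produce it; the flag set, the 64 MB heap sizing (chosen to match the ~64832K total shown above), and the use of JAVA_OPTS with run.sh are assumptions, not taken from this report:

```shell
# Assumed invocation: run.sh and JAVA_OPTS are the standard JBoss launcher
# conventions; exact flag availability varies by JDK version.
JAVA_OPTS="-verbose:gc -XX:+PrintGCDetails -Xms64m -Xmx64m" ./run.sh
```

In the log above, the tenured generation stays pinned at 58303K->58303K even across repeated Full GCs, which is what indicates that the redeployed classes are never being reclaimed.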



> Out of Memory exception after many redeployments
> ------------------------------------------------
>
>          Key: JBAS-88
>          URL: http://jira.jboss.com/jira/browse/JBAS-88
>      Project: JBoss Application Server
>         Type: Bug
>     Versions: JBossAS-3.2.6 Final
>     Reporter: SourceForge User
>     Assignee: Scott M Stark

>
>
> SourceForge Submitter: nobody
> This behavior has been observed on Windows 2000 SP2,
> running Sun JDK 1.3.1 and JBoss 2.2.2, with both the
> Tomcat and Jetty bundles.
> After many EAR hot redeployments in a short period of
> time (perhaps one every 5 to 20 minutes over a period
> of 6 hours), a redeployment results in "Out of
> Memory" exceptions being thrown.
> This happens with both the Tomcat and Jetty bundles,
> but the error is much clearer with Jetty. In the
> case of Tomcat, the application appears to redeploy
> correctly but produces a variety of errors when it is run.

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
   http://jira.jboss.com/jira/secure/Administrators.jspa
-
If you want more information on JIRA, or have a bug to report see:
   http://www.atlassian.com/software/jira



_______________________________________________
JBoss-Development mailing list
[EMAIL PROTECTED]
https://lists.sourceforge.net/lists/listinfo/jboss-development
