Kapilok,
On 4/9/2010 9:46 AM, kapilok wrote:
> 1. Start Tomcat (with maxThreads="40")
> 2. Run JMeter load (40 concurrent with some ramp-up)
>    - All requests succeed

Good.

> 3. Now load the database with some heavy process, so CPU consumption is high
> 4. Run the same JMeter load; now response times are slow

This should be no surprise, as CPU time is scarce.

> 5. Get ThreadPool Full error; take thread dump

This also shouldn't be a surprise: with maxThreads="40", you can only
handle 40 simultaneous connections. If you have one or more "heavy
processes" taking connections in addition to your 40 incoming ones, some
will be denied. Or are you saying that you are running a CPU-intensive
process outside of Tomcat, on the db server?

> 6. Kill all JMeter requests, so Tomcat can breathe
> 7. Wait 10 minutes. Try to log in to the webapp - can't log in. The
>    browser does not display the login page or any other page.

Sounds like deadlock.

> 8. Take another thread dump. It's the same. NOTE: the main thread and
>    TP-Processor4 are still running (observed with VisualVM 1.1 on JDK 6).
> 9. Try running the JMeter load. No response - no requests going through.
> 10. Waited 30 minutes and had no option but to bounce Tomcat.

Yep, sounds like deadlock to me.

> Bottom line: what can I do so I don't have to bounce Tomcat when I run
> into this situation?

If you really have deadlocked your system, you have no choice but to
restart Tomcat. Are you using a JDBC connection pool? If so, what kind,
and under what configuration?

> Partial Thread Dump
> *******************
> "http-9080-Processor40" daemon prio=6 tid=0x4d1ba000 nid=0x1e0 in
> Object.wait() [0x5162e000..0x5162fd94]
>    java.lang.Thread.State: WAITING (on object monitor)
>         at java.lang.Object.wait(Native Method)
>         - waiting on <0x07b12648> (a
> com.mchange.v2.resourcepool.BasicResourcePool)
> [...]
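For reference, the maxThreads setting you mention lives on the HTTP
Connector in conf/server.xml. A minimal sketch (the port and maxThreads
match your setup; acceptCount and connectionTimeout are illustrative
values I've filled in, not recommendations):

```xml
<!-- conf/server.xml: HTTP connector capped at 40 request-processing
     threads. Once all 40 are busy, acceptCount (illustrative value)
     controls how many further connections may queue in the OS backlog
     before new ones are refused. -->
<Connector port="9080" protocol="HTTP/1.1"
           maxThreads="40"
           acceptCount="100"
           connectionTimeout="20000" />
```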
> org.springframework.jdbc.datasource.DataSourceUtils.getConnection(DataSourceUtils.java:79)
>         at
> org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:523)
>         at
> org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:587)
>         at
> org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:612)
>         at
> org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:644)
> ...

If the above thread never makes any progress, then you have likely
exhausted your connection pool. Usually this situation corrects itself,
because one (or more) connections eventually get returned to the pool
and become available to other clients.

If this is a deadlock scenario centering on the JDBC connection pool,
then you likely have a situation in your code where you do this:

    Connection c1 = pool.getConnection();
    ...
    Connection c2 = pool.getConnection();

The above code can cause deadlock because a thread can obtain one
connection and then wait forever for its second one, while every other
thread is likewise holding one connection and waiting forever for its
second.

> "com.mchange.v2.async.ThreadPoolAsynchronousRunner$PoolThread-#2" daemon
> prio=6 tid=0x4dcda400 nid=0x6ac in Object.wait() [0x5068f000..0x5068fb94]
>    java.lang.Thread.State: TIMED_WAITING (on object monitor)

All of these threads look like they are waiting to do work that hasn't
yet been assigned. These threads look good to me.

I'm not sure how you're using the Spring framework to do your SQL stuff
for you, but you should check to see that you aren't obtaining a
connection and then firing off another query /without/ using that
connection: that will put you into the situation above.
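To make the nested-checkout deadlock above concrete, here is a small
self-contained sketch (not your code, and not the c3p0 API) that models
the pool as a java.util.concurrent.Semaphore: two permits stand in for a
pool of two connections, and acquire() stands in for pool.getConnection().
Once every thread holds one connection and asks for a second, no second
checkout can ever succeed:

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Semaphore;

public class NestedCheckoutDemo {
    // Hypothetical pool with maxPoolSize=2, modeled as two permits.
    static final Semaphore pool = new Semaphore(2);
    // Latch so both threads hold one "connection" before either asks
    // for a second; this reproduces the worst-case interleaving.
    static final CountDownLatch bothHoldOne = new CountDownLatch(2);

    public static void main(String[] args) throws InterruptedException {
        Runnable unsafe = () -> {
            try {
                pool.acquire();            // c1 = pool.getConnection()
                bothHoldOne.countDown();   // signal: I hold one connection
                bothHoldOne.await();       // wait until the other does too
                // The pool is now empty. A blocking getConnection() here
                // would wait forever; tryAcquire() shows it cannot succeed.
                boolean gotSecond = pool.tryAcquire();
                System.out.println(Thread.currentThread().getName()
                        + " second checkout succeeded: " + gotSecond);
                pool.release();            // c1.close() (finally in real code)
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        };
        Thread t1 = new Thread(unsafe, "worker-1");
        Thread t2 = new Thread(unsafe, "worker-2");
        t1.start(); t2.start();
        t1.join();  t2.join();
        // Both workers report "second checkout succeeded: false".
        System.out.println("permits back in pool: " + pool.availablePermits());
    }
}
```

The fix is the same in the simulation and in real code: close (return)
the first connection before requesting another, or restructure so each
request-handling thread checks out at most one connection at a time.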
-chris