On 20.04.2015 at 17:40, Thomas Boniface wrote:
Hi,

Both nginx and tomcat are hosted on the same server. When listing the
connections I see both the connections from nginx to tomcat (the ones
created first) and the ones from tomcat to nginx used to reply. I may
have presented things the wrong way though (I'm not too good regarding
the system level).

I do agree the high number of CLOSE_WAIT seems strange; I really feel like
nginx closed the connection before tomcat did (which I think leads to the
broken pipe exceptions observed in catalina.out). In case someone wants
to have a look I uploaded a netstat log here:
http://www.filedropper.com/netsat

The connection statistics between clients and the nginx HTTP port are:

  Count           IP:Port ConnectionState
  45467  178.32.101.62:80 TIME_WAIT
  44745    178.33.42.6:80 TIME_WAIT
  26093    178.33.42.6:80 ESTABLISHED
  25667  178.32.101.62:80 ESTABLISHED
   6898  178.32.101.62:80 FIN_WAIT2
   6723    178.33.42.6:80 FIN_WAIT2
    800  178.32.101.62:80 FIN_WAIT1
    792    178.33.42.6:80 FIN_WAIT1
    712  178.32.101.62:80 LAST_ACK
    656    178.33.42.6:80 LAST_ACK
    234    178.33.42.6:80 SYN_RECV
    232  178.32.101.62:80 SYN_RECV
     18    178.33.42.6:80 CLOSING
      8  178.32.101.62:80 CLOSING
      1    178.33.42.6:80 CLOSE_WAIT
      1        0.0.0.0:80 LISTEN

So there are lots of connections in TIME_WAIT state, which is kind of expected for a web server handling lots of short-lived client connections, but it slows down the IP stack. There are also quite a lot of established connections (about 50,000!), which means you probably want to check whether you can reduce your keep-alive timeout for nginx.
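
The client-facing keep-alive is controlled by the keepalive_timeout directive in the http (or server) block of nginx.conf. The snippet below only shows the knob; the value is an example, not a recommendation for your load:

  http {
      keepalive_timeout  15s;    # drop idle client connections after 15 seconds
      ...
  }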

The same statistics for the https port:

  Count           IP:Port ConnectionState
   2283 178.32.101.62:443 TIME_WAIT
   2125   178.33.42.6:443 TIME_WAIT
   1585 178.32.101.62:443 ESTABLISHED
   1493   178.33.42.6:443 ESTABLISHED
    484 178.32.101.62:443 FIN_WAIT2
    420   178.33.42.6:443 FIN_WAIT2
     47 178.32.101.62:443 FIN_WAIT1
     46   178.33.42.6:443 FIN_WAIT1
     25 178.32.101.62:443 LAST_ACK
     17   178.33.42.6:443 SYN_RECV
     16 178.32.101.62:443 SYN_RECV
     16   178.33.42.6:443 LAST_ACK
     10   178.33.42.6:443 CLOSING
      4 178.32.101.62:443 CLOSING
      1       0.0.0.0:443 LISTEN

Roughly the same relative picture, but only about 5% of the HTTP connection counts.

The incoming connection statistics for Tomcat (port 8080) are:

  Count           IP:Port ConnectionState
   8381    127.0.0.1:8080 CLOSE_WAIT
   1650    127.0.0.1:8080 ESTABLISHED
    127    127.0.0.1:8080 SYN_RECV
     65    127.0.0.1:8080 TIME_WAIT
      1   172.16.1.3:8080 LISTEN
      1    127.0.0.1:8080 LISTEN

The many CLOSE_WAIT connections mean that the remote side (nginx) has already closed the connection, but Tomcat has not yet. Probably the idle connection timeout / keep-alive timeout for connections between nginx and Tomcat is lower on the nginx side than on the Tomcat side.
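
If you want to compare the two sides: on the Tomcat side that is the keepAliveTimeout (and connectionTimeout) attribute of the connector; on the nginx side, keep-alive towards the backend only happens at all if an upstream block with the keepalive directive is configured together with proxy_http_version 1.1. A purely illustrative nginx sketch (real directives, but made-up values and upstream name):

  upstream tomcat_backend {
      server 127.0.0.1:8080;
      keepalive 64;               # idle backend connections each worker may keep open
  }

  location / {
      proxy_pass http://tomcat_backend;
      proxy_http_version 1.1;     # needed for keep-alive to the backend
      proxy_set_header Connection "";
  }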

Interestingly, the same connections viewed from the opposite side (nginx) show totally different statistics:

  Count           IP:Port ConnectionState
  20119    127.0.0.1:8080 SYN_SENT
   4692    127.0.0.1:8080 ESTABLISHED
    488    127.0.0.1:8080 FIN_WAIT2
    122    127.0.0.1:8080 TIME_WAIT
     13    127.0.0.1:8080 FIN_WAIT1

I wonder why we have 4692 established connections here, but only 1650 in the table above. In a static situation the numbers should be the same. It indicates that there is so much churn that the numbers vary a lot even while netstat runs.

We see a lot of SYN_SENT, so nginx wants to open many more connections to Tomcat but doesn't get them as quickly as it wants.

Finally there's a bunch of connections to remote web services:

  Count            IP:Port ConnectionState
    286      95.85.3.86:80 CLOSE_WAIT
    255   46.228.164.12:80 ESTABLISHED
    209   188.125.82.65:80 CLOSE_WAIT
    172  176.74.173.230:80 ESTABLISHED
    170   54.171.53.252:80 CLOSE_WAIT
    136   188.125.82.65:80 LAST_ACK
    129      95.85.3.86:80 LAST_ACK
    128  23.212.108.209:80 CLOSE_WAIT
    106  46.137.157.249:80 CLOSE_WAIT
    101    81.19.244.69:80 ESTABLISHED
     86   146.148.30.94:80 CLOSE_WAIT
     83    46.137.83.90:80 CLOSE_WAIT
     80   188.125.82.65:80 ESTABLISHED
     78  37.252.163.221:80 CLOSE_WAIT
     77    46.137.83.90:80 ESTABLISHED
     73  46.137.157.121:80 CLOSE_WAIT
     64    54.246.89.98:80 CLOSE_WAIT
     63  173.194.40.153:80 ESTABLISHED
     61    93.176.80.69:80 ESTABLISHED
     55  23.212.108.198:80 CLOSE_WAIT
     53    54.72.204.78:80 CLOSE_WAIT
     51  37.252.162.230:80 CLOSE_WAIT
     51  173.194.40.154:80 ESTABLISHED
     50  54.247.113.157:80 CLOSE_WAIT
     50   37.252.170.98:80 CLOSE_WAIT
     49  23.212.108.191:80 CLOSE_WAIT
     47   54.154.23.133:80 CLOSE_WAIT
     43  176.34.179.135:80 CLOSE_WAIT
     39   146.148.21.73:80 CLOSE_WAIT
     36   46.137.87.196:80 CLOSE_WAIT
     34  173.194.40.154:80 CLOSE_WAIT
     30   46.137.87.163:80 CLOSE_WAIT
     30    37.252.170.5:80 CLOSE_WAIT
     29  23.212.108.215:80 CLOSE_WAIT
     29   46.228.164.12:80 CLOSE_WAIT
     28    54.77.236.40:80 CLOSE_WAIT
     26  37.252.163.162:80 CLOSE_WAIT
     26  173.194.40.141:80 ESTABLISHED
     25   146.148.5.248:80 CLOSE_WAIT
     25    68.67.152.86:80 CLOSE_WAIT
     25    23.251.130.5:80 CLOSE_WAIT
     23  23.251.133.199:80 CLOSE_WAIT
     23   85.114.159.66:80 CLOSE_WAIT
     23   46.228.164.12:80 FIN_WAIT1
     21  192.158.31.169:80 CLOSE_WAIT
     20  23.212.108.193:80 CLOSE_WAIT
     19  37.252.170.106:80 CLOSE_WAIT
     19   146.148.10.88:80 CLOSE_WAIT
     18   85.114.159.66:80 ESTABLISHED
     15  23.251.141.112:80 CLOSE_WAIT
     14  173.194.40.141:80 CLOSE_WAIT
     13    54.246.89.98:80 ESTABLISHED
     12   46.228.164.12:80 LAST_ACK
      9  23.212.108.193:80 ESTABLISHED
      7  37.252.163.238:80 CLOSE_WAIT
      7  23.251.138.102:80 CLOSE_WAIT
      7  192.158.31.169:80 ESTABLISHED
      7  173.194.40.153:80 CLOSE_WAIT
      7  146.148.11.181:80 CLOSE_WAIT
      7      95.85.3.86:80 ESTABLISHED
      6 146.148.127.205:80 CLOSE_WAIT
      5   37.252.163.26:80 CLOSE_WAIT
      4 146.148.113.130:80 CLOSE_WAIT
      3  146.148.23.202:80 CLOSE_WAIT
      3  146.148.118.41:80 CLOSE_WAIT
      3   37.252.163.26:80 ESTABLISHED
      2    23.251.130.5:80 ESTABLISHED
      1  54.247.113.157:80 ESTABLISHED
      1  23.212.108.209:80 ESTABLISHED
      1  176.34.179.135:80 ESTABLISHED
      1  173.194.40.154:80 TIME_WAIT
      1   5.135.147.172:80 CLOSE_WAIT
      1   37.252.170.96:80 CLOSE_WAIT
      1   37.252.170.69:80 CLOSE_WAIT
      1   37.252.163.26:80 TIME_WAIT
      1    68.67.152.86:80 TIME_WAIT

Again many are in CLOSE_WAIT, meaning here that the remote web service has already closed the connection, but the local client, which probably sits inside your webapp, has not.
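
Typically those pile up when a response stream is opened but not fully read and closed. If the webapp uses plain java.net.HttpURLConnection for those web service calls, the pattern to avoid leaked sockets looks roughly like the following; this is only a generic sketch, not taken from your code, and the class/method names are made up:

  import java.io.ByteArrayOutputStream;
  import java.io.InputStream;
  import java.net.HttpURLConnection;
  import java.net.URL;

  public class WebServiceCall {

      // Sketch only: read the whole response and always close the stream and
      // connection, so the socket does not linger in CLOSE_WAIT after the
      // remote service has closed its side.
      static byte[] fetch(String serviceUrl) throws Exception {
          HttpURLConnection conn =
                  (HttpURLConnection) new URL(serviceUrl).openConnection();
          conn.setConnectTimeout(2000);   // illustrative timeouts
          conn.setReadTimeout(5000);
          try (InputStream in = conn.getInputStream()) {
              ByteArrayOutputStream buf = new ByteArrayOutputStream();
              byte[] chunk = new byte[8192];
              int n;
              while ((n = in.read(chunk)) != -1) {
                  buf.write(chunk, 0, n);
              }
              return buf.toByteArray();
          } finally {
              conn.disconnect();          // release the underlying socket
          }
      }
  }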

All in all, the numbers are quite big in many respects. You could try to grow your Tomcat connector thread pool, but the question would be whether many of those connections are actually busy and whether your system (hardware) can cope with that load.
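
For the connector pool that would be the maxThreads attribute (plus acceptCount and maxConnections) of the HTTP connector in server.xml. The snippet below only shows where the knobs are; the values are placeholders, not a sizing recommendation:

  <Connector port="8080" protocol="HTTP/1.1"
             maxThreads="400"
             acceptCount="200"
             maxConnections="8192"
             connectionTimeout="20000"
             keepAliveTimeout="5000" />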

Regards,

Rainer
