https://bz.apache.org/bugzilla/show_bug.cgi?id=62958
Bug ID: 62958
Summary: too many open files if server sits idle
Product: Tomcat 9
Version: 9.0.13
Hardware: PC
OS: Linux
Status: NEW
Severity: normal
Priority: P2
Component: Catalina
Assignee: [email protected]
Reporter: [email protected]
Target Milestone: -----
We suspect the 9.0.13 upgrade introduced an open file leak.
Steps to reproduce the problem:
1. Download apache-tomcat-9.0.13.tar.gz
2. gunzip apache-tomcat-9.0.13.tar.gz
3. tar -xf apache-tomcat-9.0.13.tar
4. cd apache-tomcat-9.0.13/bin
5. ./startup.sh
6. ps -ef | grep tomcat
7. Grab the PID# from ps output
8. Execute the following command every 10 seconds or so and watch the
number of open files rise (a script automating this polling follows the
list):
a. ls -l /proc/<PID#>/fd | grep -i "tomcat-users"
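For convenience, the polling in step 8 can be scripted. This is a minimal
sketch; it assumes a single Tomcat JVM is running, and the [o] in the grep
pattern keeps grep from matching its own process:

#!/bin/sh
# Poll every 10 seconds and count the fds the Tomcat JVM holds open
# on conf/tomcat-users.xml. Assumes exactly one Tomcat instance.
PID=$(ps -ef | grep '[o]rg.apache.catalina.startup.Bootstrap' | awk '{print $2}' | head -n 1)
while true; do
    echo "$(date '+%H:%M:%S') tomcat-users.xml fds: $(ls -l /proc/$PID/fd | grep -ci tomcat-users)"
    sleep 10
done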
Actual commands/output:
[root@wso2-as-001 bin]# ls -l /proc/1973/fd | grep -i "tomcat-users"
lr-x------. 1 root root 64 Nov 27 11:13 71 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
lr-x------. 1 root root 64 Nov 27 11:13 72 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
lr-x------. 1 root root 64 Nov 27 11:13 73 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
[root@wso2-as-001 bin]# ls -l /proc/1973/fd | grep -i "tomcat-users"
lr-x------. 1 root root 64 Nov 27 11:13 71 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
lr-x------. 1 root root 64 Nov 27 11:13 72 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
lr-x------. 1 root root 64 Nov 27 11:13 73 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
lr-x------. 1 root root 64 Nov 27 11:13 74 ->
/home/[email protected]/apache-tomcat-9.0.13/conf/tomcat-users.xml
The same steps can be followed with apache-tomcat-9.0.12.tar.gz to show that
the descriptor count did not grow in that version; a side-by-side check is
sketched below.
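A rough sketch of that comparison, assuming both tarballs have been unpacked
into the current directory and that the chosen idle interval is long enough
for the descriptor growth to show (the 600-second value is a guess and may
need tuning):

#!/bin/sh
# Start each version, let it idle, then report how many fds point at
# tomcat-users.xml. The versions run sequentially, so the default
# port 8080 does not conflict.
for V in 9.0.12 9.0.13; do
    apache-tomcat-$V/bin/startup.sh >/dev/null
    sleep 600
    PID=$(ps -ef | grep "[a]pache-tomcat-$V" | awk '{print $2}' | head -n 1)
    echo "$V: $(ls -l /proc/$PID/fd | grep -ci tomcat-users) tomcat-users.xml fds"
    apache-tomcat-$V/bin/shutdown.sh >/dev/null
    sleep 5
done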
This caused our Tomcat to essentially "hang" as it idled overnight: while
idle, the number of open files surpassed our previously configured limit of
4096.
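The growth against the limit can be watched directly; a minimal sketch, using
the same <PID#> placeholder as in step 8a:

# Compare the JVM's current fd count to its soft limit (substitute the
# PID from the ps output above):
echo "open: $(ls /proc/<PID#>/fd | wc -l) / limit: $(awk '/Max open files/ {print $4}' /proc/<PID#>/limits)"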
Note that we only notice this issue while the server remains idle. Deploying
or undeploying applications cleans these open files up, after which they
start accumulating again.
Also note that we have implemented a workaround by altering our systemd unit
file to raise the process limit to 65535 open files.
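A sketch of that workaround using a systemd drop-in; the unit name
"tomcat.service" is an assumption here, so substitute the unit actually in
use:

# Raise the open-file limit for the Tomcat service via a drop-in.
mkdir -p /etc/systemd/system/tomcat.service.d
cat > /etc/systemd/system/tomcat.service.d/limits.conf <<'EOF'
[Service]
LimitNOFILE=65535
EOF
systemctl daemon-reload
systemctl restart tomcat.service
# Verify on the running process (substitute the PID from ps, as above):
grep 'Max open files' /proc/<PID#>/limits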
System information:
[root@wso2-as-001 bin]# cat /etc/centos-release
CentOS Linux release 7.5.1804 (Core)
[root@wso2-as-001 bin]# uname -a
Linux wso2-as-001 3.10.0-862.14.4.el7.x86_64 #1 SMP Wed Sep 26 15:12:11 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
[root@wso2-as-001 bin]# java -version
java version "1.8.0_191"
Java(TM) SE Runtime Environment (build 1.8.0_191-b12)
Java HotSpot(TM) 64-Bit Server VM (build 25.191-b12, mixed mode)
Process limit information:
[root@wso2-as-001 bin]# cat /proc/1973/limits
Limit                     Soft Limit           Hard Limit           Units
Max cpu time              unlimited            unlimited            seconds
Max file size             unlimited            unlimited            bytes
Max data size             unlimited            unlimited            bytes
Max stack size            8388608              unlimited            bytes
Max core file size        0                    unlimited            bytes
Max resident set          unlimited            unlimited            bytes
Max processes             7262                 7262                 processes
Max open files            4096                 4096                 files
Max locked memory         65536                65536                bytes
Max address space         unlimited            unlimited            bytes
Max file locks            unlimited            unlimited            locks
Max pending signals       7262                 7262                 signals
Max msgqueue size         819200               819200               bytes
Max nice priority         0                    0
Max realtime priority     0                    0
Max realtime timeout      unlimited            unlimited            us