It should be chmod 644, and I also wouldn't recommend running Tomcat as root.
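
If it helps, a minimal sketch of both fixes, assuming the /usr/local/tomcat
layout from your mail and that a dedicated tomcat user/group already exists:

# setenv.sh is sourced by catalina.sh, not executed, so 644 is plenty
chown root:tomcat /usr/local/tomcat/bin/setenv.sh
chmod 644 /usr/local/tomcat/bin/setenv.sh

# check which account the running instance actually uses; it should not be root
ps -o user= -p "$(pgrep -f org.apache.catalina.startup.Bootstrap)"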

John Larsen



On Tue, May 12, 2020 at 9:28 AM Patrick Baldwin <pbald...@myersinfosys.com>
wrote:

> I've been handed an odd (to me, anyway) issue with one of our clients'
> CentOS systems.
>
> When our webapp starts running, Tomcat dies shortly afterward with an
> OutOfMemoryError. This apparently started just a few days ago.
>
> System info:
>
> Tomcat Version: Apache Tomcat/7.0.76
>
> JVM version: 1.8.0_191-b12
>
> OS: CentOS Linux release 7.6.1810 (Core)
>
>
> This seemed to indicate that catalina.sh isn’t the place for environment
> variables on Tomcat 7 for Linux:
>
> https://forums.centos.org/viewtopic.php?t=54207
>
>
> Since there isn’t a setenv.sh in /usr/local/tomcat/bin, we created one:
>
> https://stackoverflow.com/questions/9480210/tomcat-7-setenv-sh-is-not-found
>
> 195$ ls -l /usr/local/tomcat/bin/setenv.sh
>
> -rwxrwxrwx. 1 root tomcat 110 May 11 12:56 /usr/local/tomcat/bin/setenv.sh
>
> 45$ cat /usr/local/tomcat/bin/setenv.sh
>
> export CATALINA_OPTS="-server -Xms2048m -Xmx2048m"
>
> export JAVA_OPTS="-XX:PermSize=256m -XX:MaxPermSize=2048m"
>
> 46$
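>
> One caveat I noticed while setting this up: we're on Java 8, where the
> permanent generation was replaced by Metaspace, so the JVM ignores
> -XX:PermSize/-XX:MaxPermSize (it logs a startup warning to that effect).
> If we need to cap Metaspace at all, I assume it would look more like this
> (the sizes are guesses for our app), plus a heap dump on OOM so there is
> something to analyze:
>
> export CATALINA_OPTS="-server -Xms2048m -Xmx2048m"
> export JAVA_OPTS="-XX:MaxMetaspaceSize=512m -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp"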
>
>
> System memory before starting tomcat:
>
> 188$ free -h
>
>               total        used        free      shared  buff/cache   available
> Mem:            11G        2.3G        2.2G        2.0G        7.1G        6.7G
> Swap:          8.0G        1.0G        7.0G
>
>
> Started Tomcat with sudo service tomcat start.
>
> Tomcat journal errors:
>
>
> May 11 17:48:59 protrack server[7298]: SEVERE: Unexpected death of
> background thread ContainerBackgroundProcessor[StandardEngine[Catalina]]
>
> May 11 17:48:59 protrack server[7298]: java.lang.OutOfMemoryError: GC
> overhead limit exceeded
>
> May 11 17:48:59 protrack server[7298]: Exception in thread
> "ContainerBackgroundProcessor[StandardEngine[Catalina]]"
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>
> May 11 17:49:38 protrack server[7298]: Exception:
> java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in
> thread "http-bio-8080-AsyncTimeout"
>
> May 11 17:49:39 protrack server[7298]: Exception:
> java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in
> thread "ajp-bio-8009-AsyncTimeout"
>
> May 11 17:49:42 protrack server[7298]: Exception in thread
> "org.springframework.scheduling.quartz.SchedulerFactoryBean#0_QuartzSchedulerThread"
>
>
> Application log error:
>
> Caused by: java.lang.OutOfMemoryError: GC overhead limit exceeded
>
> 2020-05-11 17:49:50
> [org.springframework.scheduling.quartz.SchedulerFactoryBean#0_Worker-2]
> ERROR o.s.t.i.TransactionInterceptor - Application exception overridden by
> rollback exception
>
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>
>
> System memory while Tomcat is up, after the OutOfMemoryError pops:
>
> ksmq_tv 191$ free -h
>
>               total        used        free      shared  buff/cache   available
> Mem:            11G        3.5G        1.0G        2.0G        7.1G        5.5G
> Swap:          8.0G        1.0G        7.0G
>
>
> Stopped Tomcat with sudo service tomcat stop.
>
>
>
> System memory after Tomcat stopped:
>
> ksmq_tv 194$ free -h
>
>               total        used        free      shared  buff/cache   available
> Mem:            11G        795M        3.7G        2.0G        7.1G        8.2G
> Swap:          8.0G        1.0G        7.0G
>
>
>
> It sure doesn't look like anything is actually running the system out of
> memory at the OS level; usage definitely goes up when our app starts, but
> that's expected.
>
> Assuming no one finds any obvious errors in how we implemented setenv.sh,
> is there some way to verify what memory limits Tomcat is actually running
> under?
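>
> The only idea I've had so far, assuming the JDK tools (not just the JRE)
> are installed and 7298 is the Tomcat PID from the journal above:
>
> ps -fp 7298                  # full java command line, including any -Xmx that took effect
> jcmd 7298 VM.command_line    # the JVM's own record of its arguments
> jcmd 7298 VM.flags           # flag values the JVM actually resolved
> jmap -heap 7298              # heap configuration and current usage (JDK 8)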
>
> I was also wondering if anyone knows of an open source webapp that would
> be good to deploy, to see whether this problem is Tomcat-specific or an
> issue with our webapp. I figure if I deploy something else and it doesn't
> promptly throw an OutOfMemoryError, then it's more likely a dev issue and
> less likely a Tomcat config issue. Trying to at least figure out what
> direction I need to be looking in; any help much appreciated.
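>
> (If nothing comes to mind, I was thinking of just grabbing the sample WAR
> from the Tomcat 7 application developer's guide, along these lines:
>
> wget https://tomcat.apache.org/tomcat-7.0-doc/appdev/sample/sample.war
> sudo cp sample.war /usr/local/tomcat/webapps/
> sudo service tomcat restart
>
> and then watching whether the OutOfMemoryError still shows up.)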
>
