I think the root cause is that, since FLINK-21128[1], we use
"flink-console.sh" to start the JobManager/TaskManager processes for the
native K8s integration. That script forces the log4j configuration file
name to "log4j-console.properties", so it overrides the
-Dlog4j.configurationFile you pass via env.java.opts.
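
For reference, the log settings in flink-console.sh look roughly like the
following (paraphrased from the 1.14 scripts; treat it as a sketch, not the
exact contents):

  log_setting=("-Dlog4j.configurationFile=file:${FLINK_CONF_DIR}/log4j-console.properties"
               "-Dlog4j.configuration=file:${FLINK_CONF_DIR}/log4j-console.properties"
               "-Dlogback.configurationFile=file:${FLINK_CONF_DIR}/logback-console.xml")

  exec "$JAVA_RUN" $JVM_ARGS ${FLINK_ENV_JAVA_OPTS} "${log_setting[@]}" ...

Because "${log_setting[@]}" is placed after the options coming from
env.java.opts, its -Dlog4j.configurationFile is the last occurrence on the
JVM command line and therefore wins.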


[1]. https://issues.apache.org/jira/browse/FLINK-21128
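
A possible workaround until this is improved: put your configuration under
the forced name in the client-side conf directory, since that directory is
what gets packaged into the flink-config ConfigMap and mounted into the
JobManager/TaskManager pods. A minimal sketch, assuming the XML config is
first converted to Log4j's properties format (the forced name ends in
".properties" and the properties parser will not read XML;
"my-log4j2.properties" is a hypothetical file name):

  # on the client, before deploying the cluster
  cp my-log4j2.properties $FLINK_CONF_DIR/log4j-console.properties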


Best,
Yang

Tamir Sagi <tamir.s...@niceactimize.com> wrote on Thursday, January 13, 2022 at 20:30:

> Hey All
>
> I'm running Flink 1.14.2, and it seems to ignore the system property
> -Dlog4j.configurationFile and fall back to
> /opt/flink/conf/log4j-console.properties.
>
> I enabled debug logging for log4j2 (-Dlog4j2.debug):
>
> DEBUG StatusLogger Catching
> java.io.FileNotFoundException: file:/opt/flink/conf/log4j-console.properties (No such file or directory)
> at java.base/java.io.FileInputStream.open0(Native Method)
> at java.base/java.io.FileInputStream.open(Unknown Source)
> at java.base/java.io.FileInputStream.<init>(Unknown Source)
> at org.apache.logging.log4j.core.config.ConfigurationFactory.getInputFromString(ConfigurationFactory.java:370)
> at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:513)
> at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:499)
> at org.apache.logging.log4j.core.config.ConfigurationFactory$Factory.getConfiguration(ConfigurationFactory.java:422)
> at org.apache.logging.log4j.core.config.ConfigurationFactory.getConfiguration(ConfigurationFactory.java:322)
> at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:695)
> at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:716)
> at org.apache.logging.log4j.core.LoggerContext.start(LoggerContext.java:270)
> at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:155)
> at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:47)
> at org.apache.logging.log4j.LogManager.getContext(LogManager.java:196)
> at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getContext(AbstractLoggerAdapter.java:137)
> at org.apache.logging.slf4j.Log4jLoggerFactory.getContext(Log4jLoggerFactory.java:55)
> at org.apache.logging.log4j.spi.AbstractLoggerAdapter.getLogger(AbstractLoggerAdapter.java:47)
> at org.apache.logging.slf4j.Log4jLoggerFactory.getLogger(Log4jLoggerFactory.java:33)
> at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:329)
> at org.slf4j.LoggerFactory.getLogger(LoggerFactory.java:349)
> at org.apache.flink.runtime.rpc.akka.AkkaRpcServiceUtils.<clinit>(AkkaRpcServiceUtils.java:55)
> at org.apache.flink.runtime.rpc.akka.AkkaRpcSystem.remoteServiceBuilder(AkkaRpcSystem.java:42)
> at org.apache.flink.runtime.rpc.akka.CleanupOnCloseRpcSystem.remoteServiceBuilder(CleanupOnCloseRpcSystem.java:77)
> at org.apache.flink.runtime.rpc.RpcUtils.createRemoteRpcService(RpcUtils.java:184)
> at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.initializeServices(ClusterEntrypoint.java:300)
> at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runCluster(ClusterEntrypoint.java:243)
> at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.lambda$startCluster$1(ClusterEntrypoint.java:193)
> at org.apache.flink.runtime.security.contexts.NoOpSecurityContext.runSecured(NoOpSecurityContext.java:28)
> at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.startCluster(ClusterEntrypoint.java:190)
> at org.apache.flink.runtime.entrypoint.ClusterEntrypoint.runClusterEntrypoint(ClusterEntrypoint.java:617)
>
> Meanwhile, I can see the property being loaded while the cluster is deployed:
>
> source: {
>   class: org.apache.flink.configuration.GlobalConfiguration
>   method: loadYAMLResource
>   file: GlobalConfiguration.java
>   line: 213
> }
> message: Loading configuration property: env.java.opts,
> -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/dumps
> -Dlog4j.configurationFile=/opt/log4j2/log4j2.xml -Dlog4j2.debug=true
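>
> (Side note: when the same system property appears twice on the JVM command
> line, the last occurrence wins. A quick way to check the effective value,
> assuming both flags are passed as below:
>
> java -Dlog4j.configurationFile=/opt/log4j2/log4j2.xml \
>      -Dlog4j.configurationFile=/opt/flink/conf/log4j-console.properties \
>      -XshowSettings:properties -version 2>&1 | grep configurationFile
>     log4j.configurationFile = /opt/flink/conf/log4j-console.properties
>
> So if something appends another -Dlog4j.configurationFile after the
> env.java.opts options, mine would be silently overridden.)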
>
> In addition, according to the documentation[1], Flink ships with default
> log4j properties files located in /opt/flink/conf.
>
> However, looking into that directory once the cluster is deployed, only
> flink-conf.yaml is there.
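>
> (Checked with something along these lines; the pod name below is a
> placeholder for the actual JobManager pod:
>
> kubectl exec <jobmanager-pod> -- ls /opt/flink/conf
> flink-conf.yaml
> )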
>
>
>
> Dockerfile content:
>
> FROM flink:1.14.2-scala_2.12-java11
> ARG JAR_FILE
> COPY target/${JAR_FILE} $FLINK_HOME/usrlib/flink-job.jar
> ADD log4j2.xml /opt/log4j2/log4j2.xml
>
>
>
> *It works perfectly in 1.12.2 with the same log4j2.xml file and system
> property.*
>
> [1]
> https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/advanced/logging/#configuring-log4j-2
>
>
> Best,
> Tamir
>
