1) You either need to modify the log4j-console.properties file, or explicitly set the log4j.configurationFile property to point to your .xml file (a rough sketch of the first option follows after point 2).
2)
Have you made modifications to the distribution (e.g., removing other logging jars from the lib directory)?
Are you using application mode, or session clusters?
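
For reference, here is a rough sketch of what the Sentry part could look like in the properties format used by log4j-console.properties. This is untested; it assumes the sentry-log4j2 jar and its dependencies end up in Flink's lib/ directory (not just inside your fat jar), and it keeps your <DSN> placeholder:

# plugin package for the Sentry appender
packages = io.sentry.log4j2

# appender definition mirroring the <Sentry .../> element from your log4j2.xml
appender.sentry.name = Sentry
appender.sentry.type = Sentry
appender.sentry.dsn = <DSN>
appender.sentry.minimumEventLevel = ERROR

# attach it to the root logger, next to the appenders already defined there
rootLogger.appenderRef.sentry.ref = Sentry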

On 15/02/2022 16:41, jonas eyob wrote:
Hey,

We are deploying our Flink cluster on standalone Kubernetes, with the long-running job written in Scala.

We recently upgraded our Flink cluster from 1.12 to 1.14.3, after which we started seeing a few logging-related problems that I have been struggling to fix for the past few days. Related to this, we are also attempting to add a Sentry integration for our error logs.

PROBLEM 1 - Error logs not being sent to Sentry.
We are bundling our code and dependencies into a FAT jar, which includes a log4j2.xml specifying the Sentry appender. But if I understand the documentation <https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/deployment/advanced/logging/#configuring-log4j-2> correctly, our log4j2.xml won't be picked up by Flink, as Flink already ships a set of default logging configuration files (for both log4j and logback).

Q: How does Flink resolve which logging configuration to use?

I can see the following JVM override params being provided when running our dockerized version locally:

-Dlog.file=/opt/flink/log/flink--taskexecutor-0-thoros-taskmanager-6b9785d4df-c28n4.log
2022-02-15 10:01:59,826 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlog4j.configuration=file:/opt/flink/conf/log4j-console.properties
2022-02-15 10:01:59,827 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlog4j.configurationFile=file:/opt/flink/conf/log4j-console.properties
2022-02-15 10:01:59,830 INFO org.apache.flink.runtime.taskexecutor.TaskManagerRunner [] - -Dlogback.configurationFile=file:/opt/flink/conf/logback-console.xml

Content of the log4j2.xml (path: src/main/resources):
<?xml version="1.0" encoding="UTF-8"?>
<Configuration status="warn" packages="org.apache.logging.log4j.core,io.sentry.log4j2">
    <Appenders>
        <Console name="Console" target="SYSTEM_OUT">
            <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
        </Console>
        <Sentry name="Sentry" dsn="<DSN>" minimumEventLevel="ERROR" />
    </Appenders>
    <Loggers>
        <Root level="info">
            <AppenderRef ref="Sentry"/>
            <AppenderRef ref="Console"/>
        </Root>
    </Loggers>
</Configuration>

For our Kubernetes deployment we have followed the reference example here: https://nightlies.apache.org/flink/flink-docs-master/docs/deployment/resource-providers/standalone/kubernetes/#common-cluster-resource-definitions. My assumption is that I would also need to add the Sentry-related configuration to "log4j-console.properties" for it to be picked up by the TaskManager and JobManager? (See the sketch below for what I mean.)
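
Concretely, I assume that would mean extending the log4j-console.properties entry of the flink-config ConfigMap from those manifests, roughly like this (the ConfigMap name and keys are as I read them from the linked page, and the Sentry part would be a properties-format translation of our appender above; just my guess):

apiVersion: v1
kind: ConfigMap
metadata:
  name: flink-config
  labels:
    app: flink
data:
  flink-conf.yaml: |+
    # unchanged from the reference example
    ...
  log4j-console.properties: |+
    # existing console/file appender configuration from the reference example
    ...
    # plus a Sentry appender definition and a rootLogger.appenderRef entry for it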

PROBLEM 2:
ERROR StatusLogger Log4j2 could not find a logging implementation.
Please add log4j-core to the classpath. Using SimpleLogger to log to the console

I am not sure what's going on here. The following dependencies are bundled with the FAT jar:
"com.typesafe.scala-logging" %%"scala-logging" % scalaLoggingVersion, "org.slf4j" %"slf4j-api" %"1.7.33", "org.apache.logging.log4j" %"log4j-slf4j-impl" %"2.17.0", 
"org.apache.logging.log4j" %"log4j-core" %"2.17.0", "org.apache.logging.log4j" %%"log4j-api-scala" %"12.0", "io.sentry" %"sentry-log4j2" %"5.6.0",
I am confused about what is going on here; possibly this is not a Flink-related matter, but I am not sure. Any tips on how best to debug this would be much appreciated.
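
The only debugging idea I have had so far is to check, from inside the job, whether log4j-core actually resolves from the fat jar at runtime, along these lines (just a throwaway sketch, the object name is made up):

import org.slf4j.LoggerFactory

object LoggingClasspathCheck {
  def main(args: Array[String]): Unit = {
    val cl = Thread.currentThread().getContextClassLoader
    // Where (if anywhere) log4j-core is on the classpath; null means it is missing.
    println("log4j-core:       " + cl.getResource("org/apache/logging/log4j/core/Logger.class"))
    // Where the slf4j binding (log4j-slf4j-impl) that scala-logging goes through comes from.
    println("log4j-slf4j-impl: " + cl.getResource("org/apache/logging/log4j/slf4j/Log4jLoggerFactory.class"))
    // Which LoggerFactory implementation slf4j actually picked at runtime.
    println("slf4j factory:    " + LoggerFactory.getILoggerFactory.getClass.getName)
  }
}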
--
Thanks,
Jonas
