[ https://issues.apache.org/jira/browse/SPARK-25590?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16673689#comment-16673689 ]
Marcelo Vanzin commented on SPARK-25590:
----------------------------------------

The embedded config is still present in the current dependency:

{noformat}
$ jar tf assembly/target/scala-2.11/jars/kubernetes-model-4.1.0.jar | grep log4j
log4j.properties
{noformat}

> kubernetes-model-2.0.0.jar masks default Spark logging config
> --------------------------------------------------------------
>
>                 Key: SPARK-25590
>                 URL: https://issues.apache.org/jira/browse/SPARK-25590
>             Project: Spark
>          Issue Type: Bug
>          Components: Kubernetes
>    Affects Versions: 2.4.0
>            Reporter: Marcelo Vanzin
>            Priority: Major
>
> That jar file, which is packaged when the k8s profile is enabled, has a log4j
> configuration embedded in it:
> {noformat}
> $ jar tf /path/to/kubernetes-model-2.0.0.jar | grep log4j
> log4j.properties
> {noformat}
> As a result, Spark always uses that log4j configuration instead of its own
> default (log4j-defaults.properties), unless the user overrides it by somehow
> placing their own copy on the classpath ahead of the kubernetes one.
> You can see this by running spark-shell. With the k8s jar on the classpath:
> {noformat}
> $ ./bin/spark-shell
> ...
> Setting default log level to "WARN".
> {noformat}
> After removing the k8s jar:
> {noformat}
> $ ./bin/spark-shell
> ...
> Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
> Setting default log level to "WARN".
> {noformat}
> The proper fix would be for the k8s jar to not ship that file, followed by an
> upgrade of the dependency in Spark; but if there's something easy we can do in
> the meantime...
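
log4j 1.x picks up the first log4j.properties it finds on the classpath, so any jar that ships one can mask Spark's default in the same way. A rough way to check which assembled jars carry such a file; this is a sketch assuming the standard scala-2.11 assembly layout shown above and that unzip is available:

{noformat}
# print every assembled jar that ships a top-level log4j.properties
$ for j in assembly/target/scala-2.11/jars/*.jar; do unzip -l "$j" | grep -q ' log4j.properties$' && echo "$j"; done
{noformat}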
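
One user-side workaround in the meantime: log4j 1.x accepts an explicit config location through the log4j.configuration system property, which takes priority over classpath scanning. A minimal sketch, where /path/to/my-log4j.properties is a placeholder for a user-supplied file:

{noformat}
# point the driver JVM at an explicit log4j config; the path below is a placeholder
$ ./bin/spark-shell \
    --driver-java-options "-Dlog4j.configuration=file:/path/to/my-log4j.properties"
{noformat}

Copying conf/log4j.properties.template to conf/log4j.properties should have the same effect, since the conf directory is placed on the classpath ahead of the jars.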
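
As a stopgap rather than a proposed fix, the offending entry could also be deleted from the packaged jar directly; zip -d removes entries from an archive in place:

{noformat}
# strip the embedded config from the assembled jar in place
$ zip -d assembly/target/scala-2.11/jars/kubernetes-model-4.1.0.jar log4j.properties
{noformat}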