Continuing this discussion:
http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html
I am getting this error when I use logback-classic:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
I need to use logback-classic for my current project, so I am trying to
exclude "slf4j-log4j12" from Spark's dependencies:
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.1</version>
  <exclusions>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
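To confirm the exclusion takes effect, the resolved tree can be checked
(standard maven-dependency-plugin goal; the includes filter just narrows
the output):

# should print no matches once the exclusion is in place
mvn dependency:tree -Dincludes=org.slf4j:slf4j-log4j12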
Now, when I run my job from IntelliJ (which sets the classpath), things
work perfectly.
But when I run my job via spark-submit:
~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner spark-0.1-SNAPSHOT-jar-with-dependencies.jar
My job fails, because spark-submit sets up the classpath itself and
re-adds slf4j-log4j12.
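As far as I can tell, the binding comes back in through the Spark assembly
jar that spark-submit puts on the classpath. A quick way to check (the
assembly jar name below is from the 1.4.1-bin-hadoop2.4 distribution and
may differ on your setup):

# look for SLF4J bindings inside the Spark assembly
unzip -l ~/spark-1.4.1-bin-hadoop2.4/lib/spark-assembly-1.4.1-hadoop2.4.0.jar | grep StaticLoggerBinder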
I am not adding the Spark jar to the uber-jar; the maven-assembly-plugin
configuration excludes it:
<dependencySets>
  <dependencySet>
    ......
    <useTransitiveDependencies>false</useTransitiveDependencies>
    <excludes>
      <exclude>org.apache.spark:spark-core_2.10</exclude>
    </excludes>
  </dependencySet>
</dependencySets>
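To make sure the uber-jar itself ships only the logback binding, its
contents can be listed the same way:

# only the logback-classic binding should show up here
unzip -l spark-0.1-SNAPSHOT-jar-with-dependencies.jar | grep StaticLoggerBinder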
So how can I exclude "slf4j-log4j12.jar" when I submit a job via
spark-submit (on a per-job basis)?
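The only lead I have found so far, which I have not yet verified, is the
experimental user-classpath-first settings documented for 1.4; a sketch of
what I mean:

~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit \
  --class runner.SparkRunner \
  --conf spark.driver.userClassPathFirst=true \
  --conf spark.executor.userClassPathFirst=true \
  spark-0.1-SNAPSHOT-jar-with-dependencies.jar

Would this make the logback binding in my uber-jar win over the one on
Spark's classpath, or is there a better per-job mechanism?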
--
Thanks,
-Utkarsh