That didn't work: the "extraClassPath" flag was still appending the jars
at the end of the classpath, so it was still picking up the slf4j binding
provided by Spark. I then found this flag: --conf
"spark.executor.userClassPathFirst=true"
(http://spark.apache.org/docs/latest/configuration.html) and tried this:

➜  simspark git:(bulkrunner) ✗ spark-1.4.1-bin-hadoop2.4/bin/spark-submit \
    --class runner.SparkRunner \
    --jars "/.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar,/.m2/repository/ch/qos/logback/logback-core/1.1.2/logback-core-1.1.2.jar" \
    --conf "spark.executor.userClassPathFirst=true" \
    --conf "spark.driver.userClassPathFirst=true" \
    target/ds-tetris-simspark-0.1-SNAPSHOT-jar-with-dependencies.jar

But this led to another error: com.typesafe.config.ConfigException$Missing:
No configuration setting found for key 'akka.version'
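
From what I can tell, this usually means Akka's reference.conf is not
visible to the classloader: with userClassPathFirst=true the application
jar shadows the copy bundled in Spark's assembly. A commonly suggested
workaround, which I have not verified for this job, is to build the
uber-jar with the maven-shade-plugin and concatenate all reference.conf
files:

        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>2.4.1</version>
            <executions>
                <execution>
                    <phase>package</phase>
                    <goals>
                        <goal>shade</goal>
                    </goals>
                    <configuration>
                        <transformers>
                            <!-- Append every reference.conf into one file
                                 so Akka can find its default settings -->
                            <transformer implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                                <resource>reference.conf</resource>
                            </transformer>
                        </transformers>
                    </configuration>
                </execution>
            </executions>
        </plugin>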

Thanks,
-Utkarsh

On Mon, Aug 24, 2015 at 3:25 PM, Marcelo Vanzin <van...@cloudera.com> wrote:

> Hi Utkarsh,
>
> A quick look at slf4j's source shows it loads the first
> "StaticLoggerBinder" in your classpath. How are you adding the logback
> jar file to spark-submit?
>
> If you use "spark.driver.extraClassPath" and
> "spark.executor.extraClassPath" to add the jar, it should take
> precedence over the log4j binding embedded in the Spark assembly.
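>
> For example, something along these lines (just a sketch, the local paths
> are placeholders):
>
>   spark-submit \
>     --conf "spark.driver.extraClassPath=/path/to/logback-classic-1.1.2.jar:/path/to/logback-core-1.1.2.jar" \
>     --conf "spark.executor.extraClassPath=/path/to/logback-classic-1.1.2.jar:/path/to/logback-core-1.1.2.jar" \
>     ...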
>
>
> On Mon, Aug 24, 2015 at 3:15 PM, Utkarsh Sengar <utkarsh2...@gmail.com>
> wrote:
> > Hi Marcelo,
> >
> > When I add this exclusion rule to my pom:
> >         <dependency>
> >             <groupId>org.apache.spark</groupId>
> >             <artifactId>spark-core_2.10</artifactId>
> >             <version>1.4.1</version>
> >             <exclusions>
> >                 <exclusion>
> >                     <groupId>org.slf4j</groupId>
> >                     <artifactId>slf4j-log4j12</artifactId>
> >                 </exclusion>
> >             </exclusions>
> >         </dependency>
> >
> > The SparkRunner class works fine (from IntelliJ), but when I build a
> > jar and submit it via spark-submit, I get this error:
> >
> > Caused by: java.lang.ClassCastException:
> > org.slf4j.impl.Log4jLoggerFactory cannot be cast to
> > ch.qos.logback.classic.LoggerContext
> >     at com.opentable.logging.AssimilateForeignLogging.assimilate(AssimilateForeignLogging.java:68)
> >     at com.opentable.logging.AssimilateForeignLoggingHook.automaticAssimilationHook(AssimilateForeignLoggingHook.java:28)
> >     at com.opentable.logging.Log.<clinit>(Log.java:31)
> >
> > The failing cast is here (our logging lib is open source):
> > https://github.com/opentable/otj-logging/blob/master/logging/src/main/java/com/opentable/logging/AssimilateForeignLogging.java#L68
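> >
> > Looking at the trace, that line boils down to a cast of this shape (a
> > simplified sketch, not the exact library code):
> >
> >     // Fails when slf4j is bound to log4j instead of logback:
> >     LoggerContext context = (LoggerContext) LoggerFactory.getILoggerFactory();
> >
> > so it can only succeed when logback-classic is the binding slf4j picked.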
> >
> > Thanks,
> > -Utkarsh
> >
> >
> >
> >
> > On Mon, Aug 24, 2015 at 3:04 PM, Marcelo Vanzin <van...@cloudera.com>
> wrote:
> >>
> >> Hi Utkarsh,
> >>
> >> Unfortunately that's not going to be easy. Since Spark bundles all
> >> dependent classes into a single fat jar file, to remove that
> >> dependency you'd need to modify Spark's assembly jar (potentially in
> >> all your nodes). Doing that per-job is even trickier, because you'd
> >> probably need some kind of script to inject the correct binding into
> >> Spark's classpath.
> >>
> >> That being said, that message is not an error, it's more of a noisy
> >> warning. I'd expect slf4j to use the first binding available - in your
> >> case, logback-classic. Is that not the case?
> >>
> >>
> >> On Mon, Aug 24, 2015 at 2:50 PM, Utkarsh Sengar <utkarsh2...@gmail.com>
> >> wrote:
> >> > Continuing this discussion:
> >> > http://apache-spark-user-list.1001560.n3.nabble.com/same-log4j-slf4j-error-in-spark-9-1-td5592.html
> >> >
> >> > I am getting this error when I use logback-classic.
> >> >
> >> > SLF4J: Class path contains multiple SLF4J bindings.
> >> > SLF4J: Found binding in
> >> > [jar:file:.m2/repository/ch/qos/logback/logback-classic/1.1.2/logback-classic-1.1.2.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> > SLF4J: Found binding in
> >> > [jar:file:.m2/repository/org/slf4j/slf4j-log4j12/1.7.10/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
> >> >
> >> > I need to use logback-classic for my current project, so I am trying
> >> > to exclude "slf4j-log4j12" from spark:
> >> >         <dependency>
> >> >             <groupId>org.apache.spark</groupId>
> >> >             <artifactId>spark-core_2.10</artifactId>
> >> >             <version>1.4.1</version>
> >> >             <exclusions>
> >> >                 <exclusion>
> >> >                     <groupId>org.slf4j</groupId>
> >> >                     <artifactId>slf4j-log4j12</artifactId>
> >> >                 </exclusion>
> >> >             </exclusions>
> >> >         </dependency>
> >> >
> >> > Now, when I run my job from IntelliJ (which sets the classpath),
> >> > things work perfectly.
> >> >
> >> > But when I run my job via spark-submit:
> >> >
> >> > ~/spark-1.4.1-bin-hadoop2.4/bin/spark-submit --class runner.SparkRunner
> >> > spark-0.1-SNAPSHOT-jar-with-dependencies.jar
> >> >
> >> > the job fails, because spark-submit sets up the classpath and re-adds
> >> > slf4j-log4j12.
> >> >
> >> > I am already keeping the Spark jar out of the uber-jar via the
> >> > maven-assembly-plugin:
> >> >      <dependencySets>
> >> >         <dependencySet>
> >> >             ......
> >> >             <useTransitiveDependencies>false</useTransitiveDependencies>
> >> >             <excludes>
> >> >                 <exclude>org.apache.spark:spark-core_2.10</exclude>
> >> >             </excludes>
> >> >         </dependencySet>
> >> >     </dependencySets>
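> >> >
> >> > (A sanity check, hypothetical, reusing the jar name from the command
> >> > above:
> >> >
> >> >     jar tf spark-0.1-SNAPSHOT-jar-with-dependencies.jar | grep Log4jLoggerFactory
> >> >
> >> > should print nothing if slf4j-log4j12 stayed out of the uber-jar, in
> >> > which case the binding can only be coming from spark-submit's own
> >> > classpath.)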
> >> >
> >> > So how can I exclude "slf4j-log4j12.jar" when I submit a job via
> >> > spark-submit (on a per-job basis)?
> >> >
> >> > --
> >> > Thanks,
> >> > -Utkarsh
> >>
> >>
> >>
> >> --
> >> Marcelo
> >
> >
> >
> >
> > --
> > Thanks,
> > -Utkarsh
>
>
>
> --
> Marcelo
>



-- 
Thanks,
-Utkarsh
