[ https://issues.apache.org/jira/browse/SPARK-1952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14012040#comment-14012040 ]

Patrick Wendell commented on SPARK-1952:
----------------------------------------

So I think the issue here is simply that Spark depends on slf4j 1.7.x, Pig 
depends on slf4j 1.6.x, and those aren't binary-compatible. If you look, it's 
complaining about the signature of that log() method, which changed between 
1.6 and 1.7. Further compounding things, Pig uses commons-logging, so it 
routes its logging through (commons-logging -> slf4j).

http://grepcode.com/file/repo1.maven.org/maven2/org.slf4j/slf4j-api/1.6.1/org/slf4j/spi/LocationAwareLogger.java#LocationAwareLogger.log%28org.slf4j.Marker%2Cjava.lang.String%2Cint%2Cjava.lang.String%2Cjava.lang.Object%5B%5D%2Cjava.lang.Throwable%29

http://grepcode.com/file/repo1.maven.org/maven2/org.slf4j/slf4j-api/1.7.5/org/slf4j/spi/LocationAwareLogger.java#LocationAwareLogger.log%28org.slf4j.Marker%2Cjava.lang.String%2Cint%2Cjava.lang.String%2Cjava.lang.Object%5B%5D%2Cjava.lang.Throwable%29
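For what it's worth, you can decode the descriptor in that NoSuchMethodError by hand to see exactly which overload Pig's logging bridge is trying to call. A throwaway sketch (plain Java, nothing Spark- or slf4j-specific assumed -- it just parses standard JVM method descriptors):

```java
public class DescriptorDecoder {
    // Decode one JVM type descriptor starting at pos[0]; advances pos[0].
    static String decodeType(String d, int[] pos) {
        char c = d.charAt(pos[0]++);
        switch (c) {
            case 'V': return "void";
            case 'Z': return "boolean";
            case 'B': return "byte";
            case 'C': return "char";
            case 'S': return "short";
            case 'I': return "int";
            case 'J': return "long";
            case 'F': return "float";
            case 'D': return "double";
            case '[': return decodeType(d, pos) + "[]";
            case 'L': {
                int end = d.indexOf(';', pos[0]);
                String name = d.substring(pos[0], end).replace('/', '.');
                pos[0] = end + 1;
                return name;
            }
            default: throw new IllegalArgumentException("bad descriptor: " + c);
        }
    }

    // Turn "name" + "(<params>)<ret>" into a readable Java signature.
    static String decodeMethod(String name, String desc) {
        int[] pos = {1}; // skip '('
        StringBuilder params = new StringBuilder();
        while (desc.charAt(pos[0]) != ')') {
            if (params.length() > 0) params.append(", ");
            params.append(decodeType(desc, pos));
        }
        pos[0]++; // skip ')'
        return decodeType(desc, pos) + " " + name + "(" + params + ")";
    }

    public static void main(String[] args) {
        // The exact descriptor from the stack trace in this issue:
        String desc = "(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;"
                    + "[Ljava/lang/Object;Ljava/lang/Throwable;)V";
        System.out.println(decodeMethod("log", desc));
        // -> void log(org.slf4j.Marker, java.lang.String, int,
        //             java.lang.String, java.lang.Object[], java.lang.Throwable)
    }
}
```

So the missing method is the six-argument log() overload taking an Object[] of format arguments, which is the one the grepcode links above point at.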

The Spark code itself actually doesn't use any new APIs that aren't in slf4j 
1.6, so I can see how this worked in 0.9.0.

I think the problem here is that Spark 1.0 is now pulling in jul-to-slf4j 
1.7.x, and that _does_ use newer slf4j 1.7 APIs. So I'd remove it from the 
Spark 1.0 build (we have an explicit dependency on it) and see if that works. 
Basically, try to produce a Spark assembly without "SLF4JLocationAwareLog.class".

I think that should work if I'm understanding this correctly.
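In sbt terms, the idea would look something like the following. This is only a sketch: the exact place the dependency is declared in the Spark build (SparkBuild.scala) may differ, so treat the settings here as illustrative rather than a drop-in patch.

```scala
// Inside the Spark build: filter the explicit jul-to-slf4j dependency out,
// so the assembly stops bundling the 1.7-only bridge classes.
libraryDependencies := libraryDependencies.value.filterNot(
  d => d.organization == "org.slf4j" && d.name == "jul-to-slf4j")

// Or, from the consuming side, exclude it when depending on Spark:
libraryDependencies += ("org.apache.spark" %% "spark-core" % "1.0.0")
  .exclude("org.slf4j", "jul-to-slf4j")
```

Either way, `jar tf` on the resulting assembly (as in the reproduction below) is the quick way to confirm whether the offending class is gone.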

> slf4j version conflicts with pig
> --------------------------------
>
>                 Key: SPARK-1952
>                 URL: https://issues.apache.org/jira/browse/SPARK-1952
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: pig 12.1 on Cloudera Hadoop, CDH3
>            Reporter: Ryan Compton
>              Labels: pig, slf4j
>
> Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when 
> they "register" a jar containing Spark. The error appears to be related to 
> org.slf4j.spi.LocationAwareLogger.log.
> {code}
> Caused by: java.lang.RuntimeException: Could not resolve error that
> occured when launching map reduce job: java.lang.NoSuchMethodError:
> org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
> at 
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
> at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
> {code}
> To reproduce: compile Spark via $ SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt 
> assembly and register the resulting jar into a pig script. E.g.
> {code}
> REGISTER 
> /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
> data0 = LOAD 'data' USING PigStorage();
> ttt = LIMIT data0 10;
> DUMP ttt;
> {code}
> The Spark-1.0 jar includes some slf4j dependencies that were not present in 
> 0.9.1
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf 
> spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep 
> LocationAware
>   3259 Mon Mar 25 21:49:34 PDT 2013 
> org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
>    479 Fri Dec 13 16:44:40 PST 2013 
> parquet/org/slf4j/spi/LocationAwareLogger.class
> {code}
> vs.
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf 
> spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep 
> LocationAware
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
> {code}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
