[ https://issues.apache.org/jira/browse/SPARK-1952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14016045#comment-14016045 ]

Ryan Compton commented on SPARK-1952:
-------------------------------------

An additional note for anyone else who runs into this. Older versions of Hadoop 
(in my case CDH3) use an older version of slf4j:

{code}
rfcompton@node19 /u/l/hadoop-0.20> cat ivy/libraries.properties | grep -i slf4j
slf4j-api.version=1.4.3
slf4j-log4j12.version=1.4.3
{code}

Trying to run a standard MapReduce job via "hadoop jar" still throws 
"java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log" if 
Spark-1.0.0 is bundled inside your jar. The failing six-argument log() 
overload was only added in slf4j 1.6.0, so the 1.4.3 API on the Hadoop 
classpath cannot resolve it.
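For anyone debugging this, a quick (unofficial) way to tell whether a given jar carries the newer slf4j interface is to scan the raw bytes of the LocationAwareLogger class for the 1.6.x method descriptor. This is a crude sketch, not a real bytecode parser; the helper name is mine and the jar path is whatever assembly you built:

{code}
import zipfile

# Descriptor of the six-argument log() overload (added in slf4j 1.6.0).
# If it appears in the class file's constant pool, the jar bundles the
# newer API that Spark-1.0.0 expects.
SIX_ARG_LOG = (b"(Lorg/slf4j/Marker;Ljava/lang/String;I"
               b"Ljava/lang/String;[Ljava/lang/Object;"
               b"Ljava/lang/Throwable;)V")

def has_six_arg_log(jar_path):
    # Crude check: read the raw .class bytes and look for the descriptor
    # string (good enough to tell slf4j 1.4.x apart from 1.6.x).
    with zipfile.ZipFile(jar_path) as jar:
        data = jar.read("org/slf4j/spi/LocationAwareLogger.class")
    return SIX_ARG_LOG in data
{code}

If this returns False for the jar your job actually runs against, you are on the old 1.4.x interface and the NoSuchMethodError above is expected.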

If you have $HADOOP_HOME set, Pig will launch pig-withouthadoop.jar, which 
picks up the Hadoop version from $HADOOP_HOME and throws the error from there.

I think the best way to avoid all this is to not use bleeding-edge versions of 
Spark with two-year-old versions of Hadoop.
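If you do have to mix them, one workaround that may help (untested on CDH3; the Maven coordinates below are illustrative) is to keep Spark's slf4j artifacts out of your fat jar so only the cluster's copy is on the classpath. Note this may not be sufficient on its own, since other bundled bridges (e.g. the jcl-over-slf4j classes visible in the jar listing below) also expect the newer API:

{code:xml}
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <!-- keep Spark's slf4j out of the assembled job jar so the
         cluster's (older) slf4j-api is the only one visible -->
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.slf4j</groupId>
      <artifactId>slf4j-log4j12</artifactId>
    </exclusion>
  </exclusions>
</dependency>
{code}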



> slf4j version conflicts with pig
> --------------------------------
>
>                 Key: SPARK-1952
>                 URL: https://issues.apache.org/jira/browse/SPARK-1952
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: Pig 0.12.1 on Cloudera Hadoop, CDH3
>            Reporter: Ryan Compton
>              Labels: pig, slf4j
>
> Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when 
> they "register" a jar containing Spark. The error appears to be related to 
> org.slf4j.spi.LocationAwareLogger.log.
> {code}
> Caused by: java.lang.RuntimeException: Could not resolve error that
> occured when launching map reduce job: java.lang.NoSuchMethodError:
> org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
> at 
> org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
> at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
> {code}
> To reproduce: compile Spark via {{SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt 
> assembly}} and register the resulting jar in a Pig script, e.g.:
> {code}
> REGISTER 
> /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
> data0 = LOAD 'data' USING PigStorage();
> ttt = LIMIT data0 10;
> DUMP ttt;
> {code}
> The Spark-1.0 jar includes some slf4j dependencies that were not present in 
> 0.9.1:
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf 
> spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep 
> LocationAware
>   3259 Mon Mar 25 21:49:34 PDT 2013 
> org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
>    479 Fri Dec 13 16:44:40 PST 2013 
> parquet/org/slf4j/spi/LocationAwareLogger.class
> {code}
> vs.
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf 
> spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep 
> LocationAware
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
> {code}



--
This message was sent by Atlassian JIRA
(v6.2#6252)
