[ https://issues.apache.org/jira/browse/SPARK-1952?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14011971#comment-14011971 ]

Ryan Compton commented on SPARK-1952:
-------------------------------------

No luck.

I modified project/SparkBuild.scala (slf4j-related lines below):
{code}
[rfcompton@node19 spark-1.0.0]$ cat project/SparkBuild.scala | grep slf
  val slf4jVersion = "1.7.2"
  val excludeSLF4J = ExclusionRule(organization = "org.slf4j")
        "org.slf4j"                  % "slf4j-api"        % slf4jVersion,
        "org.slf4j"                  % "slf4j-log4j12"    % slf4jVersion,
        "org.slf4j"                  % "jul-to-slf4j"     % slf4jVersion,
        "org.slf4j"                  % "jcl-over-slf4j"   % slf4jVersion,
        "org.spark-project.akka"    %% "akka-slf4j"       % akkaVersion,
      "com.typesafe" %% "scalalogging-slf4j" % "1.0.1"
{code}
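
(For reference, as far as I understand, bumping slf4jVersion only changes what Spark compiles against; the org.slf4j classes still get bundled into the assembly. Below is a rough sbt sketch, with placeholder dependency names rather than the real SparkBuild.scala entries, of how an exclusion rule like the excludeSLF4J value above is normally attached to a dependency so that its transitive org.slf4j artifacts stay out of the build.)
{code}
// Rough sbt sketch, not the actual SparkBuild.scala (the dependency name and
// version are placeholders): attaching the exclusion rule keeps that
// dependency's transitive org.slf4j artifacts out of the build, so only one
// slf4j-api version can end up on the assembly classpath.
import sbt._
import Keys._

object Slf4jExclusionExample {
  val excludeSLF4J = ExclusionRule(organization = "org.slf4j")

  val settings = Seq(
    libraryDependencies +=
      ("org.apache.hadoop" % "hadoop-client" % "0.20.2-cdh3u3")
        .excludeAll(excludeSLF4J)
  )
}
{code}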

But when I register the rebuilt 1.0 assembly jar, I still get the same stack trace:
{code}
Pig Stack Trace
---------------
ERROR 2998: Unhandled internal error. org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V

java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
        at org.apache.commons.logging.impl.SLF4JLocationAwareLog.debug(SLF4JLocationAwareLog.java:133)
        at org.apache.pig.parser.QueryParserDriver.expandMacro(QueryParserDriver.java:264)
        at org.apache.pig.parser.QueryParserDriver.parse(QueryParserDriver.java:180)
        at org.apache.pig.PigServer$Graph.parseQuery(PigServer.java:1678)
        at org.apache.pig.PigServer$Graph.access$000(PigServer.java:1411)
        at org.apache.pig.PigServer.parseAndBuild(PigServer.java:344)
        at org.apache.pig.PigServer.executeBatch(PigServer.java:369)
        at org.apache.pig.PigServer.executeBatch(PigServer.java:355)
        at org.apache.pig.tools.grunt.GruntParser.executeBatch(GruntParser.java:140)
        at org.apache.pig.tools.grunt.GruntParser.processDump(GruntParser.java:769)
        at org.apache.pig.tools.pigscript.parser.PigScriptParser.parse(PigScriptParser.java:372)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:198)
        at org.apache.pig.tools.grunt.GruntParser.parseStopOnError(GruntParser.java:173)
        at org.apache.pig.tools.grunt.Grunt.exec(Grunt.java:84)
        at org.apache.pig.Main.run(Main.java:607)
        at org.apache.pig.Main.main(Main.java:156)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:197)
================================================================================
{code}
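
The method the trace says is missing, the six-argument LocationAwareLogger.log(Marker, String, int, String, Object[], Throwable), only exists in slf4j-api 1.6.0 and later; SLF4JLocationAwareLog (the jcl-over-slf4j class bundled in the assembly) is compiled against it. Here is a small diagnostic sketch (the object name is mine, and it assumes it runs on the same classpath Pig uses) to show which slf4j-api actually wins and whether it has that method:
{code}
// Diagnostic sketch (illustrative, not from the ticket): prints where
// org.slf4j.spi.LocationAwareLogger was loaded from and whether it exposes
// the six-argument log(...) that was added in slf4j-api 1.6.0, i.e. the
// method the stack trace above says is missing.
object Slf4jCheck {
  def main(args: Array[String]): Unit = {
    val cls = Class.forName("org.slf4j.spi.LocationAwareLogger")
    val source = Option(cls.getProtectionDomain.getCodeSource)
      .map(_.getLocation.toString)
      .getOrElse("<unknown source>")
    println(s"LocationAwareLogger loaded from: $source")

    val hasSixArgLog = cls.getMethods.exists { m =>
      m.getName == "log" && m.getParameterTypes.length == 6
    }
    println(s"has log(Marker, String, int, String, Object[], Throwable): $hasSixArgLog")
  }
}
{code}
If it prints false and points at an older jar from the Hadoop or Pig lib directory, that would confirm the old slf4j-api is shadowing the one in the assembly.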

This does not happen if I register the 0.9.1 assembly:
{code}
REGISTER /usr/share/osi1/spark-0.9.1-bin-hadoop1/assembly/target/scala-2.10/spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar;
{code}

Recompiling Pig didn't fix it either.

Is it possible to build Spark 1.0 against Hadoop 0.20.2-cdh3u3?
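
On the cdh3u3 question: as far as I can tell the sbt build just passes SPARK_HADOOP_VERSION through to the Hadoop dependency (the same variable used in the reproduction command quoted below), so the cdh3u4 invocation with 0.20.2-cdh3u3 substituted should work, provided a resolver that carries that artifact is configured. A rough sketch of the pattern, with guessed identifier names rather than the real SparkBuild.scala code:
{code}
// Illustrative fragment only (identifier names and the default are guesses,
// not copied from SparkBuild.scala): the Hadoop artifact version is taken
// from the SPARK_HADOOP_VERSION environment variable and passed straight to
// the Hadoop dependency, so cdh3u3 vs. cdh3u4 is just a matter of which
// artifact a configured resolver (e.g. Cloudera's repo) can supply.
import sbt._
import Keys._

object HadoopVersionExample {
  val hadoopVersion = sys.env.getOrElse("SPARK_HADOOP_VERSION", "1.0.4")

  val settings = Seq(
    resolvers += "Cloudera Repository" at "https://repository.cloudera.com/artifactory/cloudera-repos/",
    libraryDependencies += "org.apache.hadoop" % "hadoop-client" % hadoopVersion
  )
}
{code}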




> slf4j version conflicts with pig
> --------------------------------
>
>                 Key: SPARK-1952
>                 URL: https://issues.apache.org/jira/browse/SPARK-1952
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>         Environment: pig 12.1 on Cloudera Hadoop, CDH3
>            Reporter: Ryan Compton
>              Labels: pig, slf4j
>
> Upgrading from Spark-0.9.1 to Spark-1.0.0 causes all Pig scripts to fail when 
> they "register" a jar containing Spark. The error appears to be related to 
> org.slf4j.spi.LocationAwareLogger.log.
> {code}
> Caused by: java.lang.RuntimeException: Could not resolve error that occured when launching map reduce job: java.lang.NoSuchMethodError: org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
>         at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
>         at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)
> {code}
> To reproduce: compile Spark via $ SPARK_HADOOP_VERSION=0.20.2-cdh3u4 sbt/sbt assembly and register the resulting jar in a Pig script, e.g.:
> {code}
> REGISTER /usr/share/spark-1.0.0/assembly/target/scala-2.10/spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar;
> data0 = LOAD 'data' USING PigStorage();
> ttt = LIMIT data0 10;
> DUMP ttt;
> {code}
> The Spark-1.0 jar includes some slf4j dependencies that were not present in 0.9.1:
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-1.0.0-SNAPSHOT-hadoop0.20.2-cdh3u4.jar | grep -i "slf" | grep LocationAware
>   3259 Mon Mar 25 21:49:34 PDT 2013 org/apache/commons/logging/impl/SLF4JLocationAwareLog.class
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
>    479 Fri Dec 13 16:44:40 PST 2013 parquet/org/slf4j/spi/LocationAwareLogger.class
> {code}
> vs.
> {code}
> rfcompton@node19 /u/s/o/s/a/t/scala-2.10> jar tvf spark-assembly-0.9.1-hadoop0.20.2-cdh3u3.jar | grep -i "slf" | grep LocationAware
>    455 Mon Mar 25 21:49:22 PDT 2013 org/slf4j/spi/LocationAwareLogger.class
> {code}


