I use both Pig and Spark. All my code is built with Maven into a single giant
*-jar-with-dependencies.jar. I recently upgraded to Spark 1.0, and now
all my Pig scripts fail with:

Caused by: java.lang.RuntimeException: Could not resolve error that
occured when launching map reduce job: java.lang.NoSuchMethodError:
org.slf4j.spi.LocationAwareLogger.log(Lorg/slf4j/Marker;Ljava/lang/String;ILjava/lang/String;[Ljava/lang/Object;Ljava/lang/Throwable;)V
    at org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher$JobControlThreadExceptionHandler.uncaughtException(MapReduceLauncher.java:598)
    at java.lang.Thread.dispatchUncaughtException(Thread.java:1874)


Did Spark 1.0 change the version of slf4j it pulls in? I can't seem to find
the conflict via mvn dependency:tree.
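For reference, this is roughly how I've been looking for it (the -Dincludes
and -Dverbose flags are standard maven-dependency-plugin options for
filtering the tree and showing omitted/conflicting versions; adjust the
groupId filter if needed):

    mvn dependency:tree -Dincludes=org.slf4j -Dverbose

If a second slf4j-api version does turn up, I assume I could pin a single
version in dependencyManagement, but I'd like to know where it's coming
from first.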
