Re: Hadoop interface vs class

2014-06-26 Thread Sean Owen
You seem to have the Hadoop 2 binaries on the runtime classpath, since the error says TaskAttemptContext was found as an interface, which is what it is in Hadoop 2. So the error indicates that some of the Spark code in use was compiled against Hadoop 1, where it is a class.

On Wed, Jun 25, 2014 at 4:41 PM, Robert James srobertja...@gmail.com wrote: After upgrading to Spark …
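To illustrate the mismatch Sean describes, here is a minimal Scala sketch (not from the thread; Context, Impl, and Caller are hypothetical stand-ins for TaskAttemptContext and the code that calls it):

    // --- Version 1, the Hadoop 1 shape: Context is a class ---
    class Context { def ping(): Unit = println("pong") }
    class Impl extends Context
    object Impl { def get(): Context = new Impl }

    // Caller is compiled against version 1, so c.ping() becomes an
    // invokevirtual instruction, which requires Context to be a class.
    object Caller {
      def main(args: Array[String]): Unit = {
        val c: Context = Impl.get()
        c.ping()
      }
    }

    // --- Version 2, the Hadoop 2 shape: Context is an interface ---
    trait Context { def ping(): Unit }
    class Impl extends Context { def ping(): Unit = println("pong") }
    object Impl { def get(): Context = new Impl }

    // Recompile only Context and Impl, run the stale Caller.class, and
    // the JVM fails with:
    //   java.lang.IncompatibleClassChangeError:
    //     Found interface Context, but class was expected

The same thing happens when bytecode compiled against Hadoop 1's TaskAttemptContext class meets Hadoop 2's interface at runtime.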

Re: Hadoop interface vs class

2014-06-26 Thread Sean Owen
Yes it does. The idea is to override the dependency if needed. I thought you mentioned that you had built for Hadoop 2.

On Jun 26, 2014 11:07 AM, Robert James srobertja...@gmail.com wrote: Yes. As far as I can tell, Spark seems to be including Hadoop 1 via its transitive dependency: …
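A sketch of that override in sbt, assuming Spark 1.0.0 from Maven Central and Hadoop 2.2.0 as the target (the version numbers are illustrative, not from the thread):

    // build.sbt: declaring hadoop-client directly lets the newer version
    // win over the Hadoop 1.x version spark-core pulls in transitively.
    libraryDependencies ++= Seq(
      "org.apache.spark"  %% "spark-core"    % "1.0.0",
      "org.apache.hadoop" %  "hadoop-client" % "2.2.0"
    )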

Re: Hadoop interface vs class

2014-06-26 Thread Robert James
On 6/26/14, Sean Owen so...@cloudera.com wrote: Yes it does. The idea is to override the dependency if needed. I thought you mentioned that you had built for Hadoop 2.

I'm very confused :-( I downloaded the Spark distro for Hadoop 2, and installed it on my machine. But the code doesn't have a …

Re: Hadoop interface vs class

2014-06-26 Thread Sean Owen
On Thu, Jun 26, 2014 at 1:44 PM, Robert James srobertja...@gmail.com wrote: I downloaded the Spark distro for Hadoop 2, and installed it on my machine. But the code doesn't have a reference to that path - it uses sbt for dependencies.

As far as I can tell, using sbt or maven or ivy will …
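One common way to keep the two from colliding (a sketch, not something prescribed in the thread) is to compile against Spark but let the installed distribution supply the jars at runtime, so whatever Hadoop version sbt resolves never reaches the executors:

    // build.sbt: "provided" keeps spark-core, and its transitive Hadoop
    // dependency, out of the application's runtime classpath; the jars
    // from the downloaded Hadoop 2 distro are used instead.
    libraryDependencies +=
      "org.apache.spark" %% "spark-core" % "1.0.0" % "provided"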

Hadoop interface vs class

2014-06-25 Thread Robert James
After upgrading to Spark 1.0.0, I get this error:

ERROR org.apache.spark.executor.ExecutorUncaughtExceptionHandler - Uncaught exception in thread Thread[Executor task launch worker-2,5,main]
java.lang.IncompatibleClassChangeError: Found interface org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected …
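A quick way to see which Hadoop flavor a JVM is actually loading (an illustrative diagnostic, not part of the original message; WhichHadoop is a hypothetical name):

    // Prints whether TaskAttemptContext resolved to the Hadoop 1 class or
    // the Hadoop 2 interface, and which jar it was loaded from.
    object WhichHadoop {
      def main(args: Array[String]): Unit = {
        val cls = Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext")
        println(if (cls.isInterface) "interface (Hadoop 2.x)" else "class (Hadoop 1.x)")
        // getCodeSource can be null for bootstrap-loaded classes
        println(Option(cls.getProtectionDomain.getCodeSource).map(_.getLocation))
      }
    }

Running this with the same classpath as the failing executor shows which jar is winning.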