I assume you mean Spark 0.9 (not 9.0) and CDH 4.2 (there is no Hadoop 4.2).

"IncompatibleClassChangeError" usually is due to Hadoop version
mismatches. CDH 4.0+ is based on Hadoop 2 and so you should build
Spark for Hadoop 2 with -Dhadoop.version=2.2.0, which should be OK.
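For example, with the Maven build that would be something along these
lines (flags as I recall them from the Spark 0.9 building docs; adjust
the Hadoop version to match your cluster):

  mvn -Dhadoop.version=2.2.0 -DskipTests clean package

or, with the sbt build, SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly.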

But your class change error is not the usual one, so I am not sure
this is actually the issue (although it might also be an issue). It's:

Exception in thread "main" java.lang.IncompatibleClassChangeError:
class org.apache.spark.util.InnerClosureFinder has interface
org.objectweb.asm.ClassVisitor as super class

This sounds like some old and new Spark classes got mixed up
somewhere, perhaps?
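If I remember right, ClassVisitor was an interface in ASM 3.x and
became an abstract class in ASM 4.x, so an error like this usually
means two different ASM versions ended up on the classpath. A rough
way to check (the jar paths below are just placeholders; point them at
your actual Spark assembly and Hadoop lib directories) would be:

  for j in /path/to/spark/assembly/*.jar /path/to/hadoop/lib/*.jar; do
    unzip -l "$j" 2>/dev/null | grep -q 'org/objectweb/asm/ClassVisitor' \
      && echo "$j"
  done

Any jar beyond the one Spark ships that contains that class is a
candidate for the conflict.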

--
Sean Owen | Director, Data Science | London


On Tue, Feb 11, 2014 at 2:04 PM, JAGANADH G <jagana...@gmail.com> wrote:
> Hi Team,
>
> I was trying to execute a Spark stand-alone application (WordCount) in a
> cluster with two workers.
> Spark version is 9.0
> Hadoop Version 4.2
> When I executed the job I got the following error
> Refer to the pastebin at http://tny.cz/d78ed2da
>
> Any clue on how to resolve this?
>
> Best regards
>
> Jaggu
>
> --
> **********************************
> JAGANADH G
> http://jaganadhg.in
> ILUGCBE
> http://ilugcbe.org.in
