Hi Jaggu,

I saw the same error at one point using Apache Hadoop 1.2.1 with Spark 0.9. I resolved it by changing my app's build.sbt to declare the Spark and Hadoop dependencies as "provided" only, so they are not bundled into my app's jar, and by ensuring at launch time that Spark's jar comes before the app jar on the classpath. The root cause appears to be that the "asm" package is widely used, different libraries depend on different versions of it with incompatible interfaces, and those versions are not always shaded.
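As a sketch of the "provided" approach, a build.sbt along these lines should work. The project name, Scala version, and artifact versions below are assumptions based on your reported setup (spark-0.9.0-incubating on CDH 4.2.1), so adjust them to match your cluster:

```scala
// build.sbt sketch: "provided" makes sbt compile against Spark and Hadoop
// but leave them out of the packaged application jar, so the versions on
// the cluster classpath (and their asm dependencies) win at run time.
name := "WordCount"          // hypothetical project name

version := "0.1"

scalaVersion := "2.10.3"     // Spark 0.9 was built for Scala 2.10

// Cloudera's repository is needed to resolve CDH-versioned Hadoop artifacts.
resolvers += "Cloudera Repository" at
  "https://repository.cloudera.com/artifactory/cloudera-repos/"

libraryDependencies ++= Seq(
  "org.apache.spark"  %% "spark-core"    % "0.9.0-incubating" % "provided",
  "org.apache.hadoop" %  "hadoop-client" % "2.0.0-cdh4.2.1"   % "provided"
)
```

Then when launching, put the Spark assembly jar ahead of your application jar on the classpath so Spark's (shaded) dependencies are loaded first.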
Hope that helps,
Bryn

On Wed, Feb 12, 2014 at 4:33 AM, Jaggu <jagana...@gmail.com> wrote:
> Hi Sean,
> Sorry for the typo:
>
> Correct versions:
> Spark: spark-0.9.0-incubating-bin-cdh4
> Hadoop: Hadoop 2.0.0-cdh4.2.1 (installed using Cloudera Manager)
>
> My code is available at:
> https://gist.github.com/jaganadhg/8851886/raw/aa2379e91baa90be017ccd1eee4aaf8aff9459eb/WordCount
>
> SBT build script: http://pastebin.com/0mnUvb9D
>
> Error: http://pastebin.com/C0Cbg8T2
>
> Still I am struggling with the same error.
>
> Best regards,
> Jagan
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/Spark-stand-alone-application-java-lang-IncompatibleClassChangeError-tp1385p1434.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.