Given that the error is
java.lang.IncompatibleClassChangeError: Found interface
org.apache.hadoop.mapreduce.TaskAttemptContext, but class was expected
...this usually means there is a Hadoop version problem.
In particular, it looks like https://issues.apache.org/jira/browse/SPARK-3039: the Spark assembly built for Hadoop 2 pulls in avro-mapred compiled against the Hadoop 1 API, which affects setups like this one.
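For background: org.apache.hadoop.mapreduce.TaskAttemptContext was a concrete class in the Hadoop 1 mapreduce API and became an interface in Hadoop 2, which is exactly what the error message reports. A small illustrative sketch (the class name is real; the CheckHadoopApi object and its output strings are my own, for diagnosis only) that checks which flavor is on your runtime classpath:

```scala
// Sketch: report whether the TaskAttemptContext visible on this
// classpath is the Hadoop 1 style (class) or Hadoop 2 style (interface).
object CheckHadoopApi {
  // Returns "interface" or "class" for any loaded class.
  def flavor(c: Class[_]): String =
    if (c.isInterface) "interface" else "class"

  def main(args: Array[String]): Unit = {
    try {
      val c = Class.forName("org.apache.hadoop.mapreduce.TaskAttemptContext")
      if (c.isInterface)
        println("TaskAttemptContext is an interface: Hadoop 2-style API")
      else
        println("TaskAttemptContext is a class: Hadoop 1-style API")
    } catch {
      case _: ClassNotFoundException =>
        println("Hadoop classes not on this classpath")
    }
  }
}
```

Running this with the same classpath your job uses (e.g. via spark-submit) shows which Hadoop flavor actually wins at runtime, independent of what your build file declares.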
Hi All,
I saw some advice online about forcing avro-mapred to Hadoop 2 using classifiers.
Now my configuration is:

val avro = "org.apache.avro" % "avro-mapred" % V.avro classifier "hadoop2"
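For reference, a fuller sbt sketch of the classifier workaround usually suggested for SPARK-3039. The version number here is an assumption (substitute your V.avro), and whether it helps depends on classpath ordering: if the Spark assembly on the cluster already bundles the hadoop1 avro-mapred, the bundled copy can still shadow yours.

```scala
// build.sbt sketch (assumed version; adjust to your cluster)
libraryDependencies ++= Seq(
  // Force the Hadoop 2 build of avro-mapred via its Maven classifier;
  // without the classifier you get the hadoop1-flavored artifact.
  "org.apache.avro" % "avro-mapred" % "1.7.7" classifier "hadoop2"
)
```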
However, I still get java.lang.IncompatibleClassChangeError. I think I am
not building sp...
Hi Experts,
I have recently installed HDP 2.2 (which depends on Hadoop 2.6).
My spark 1.2 is built with hadoop 2.3 profile.
(mvn -Pyarn -Dhadoop.version=2.6.0 -Dyarn.version=2.6.0 -Phadoop-2.3
-Phive -DskipTests clean package)
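Note that this command mixes versions: it asks for hadoop.version=2.6.0 while activating the hadoop-2.3 profile. Spark 1.2's newest Hadoop profile is hadoop-2.4, which is the one intended for Hadoop 2.4 and later clusters. A build sketch under that assumption:

```shell
# Sketch: build Spark 1.2 against Hadoop 2.6 using the hadoop-2.4
# profile (Spark 1.2's profile for Hadoop 2.4+) so the profile and
# the requested hadoop.version agree.
mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.6.0 \
    -Phive -DskipTests clean package
```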
My program has the following dependencies:

val avro = "org.apach...