Hi:

I tried to build Spark (1.1.0) against Hadoop 2.4.0 and ran a simple
word count example from spark-shell on Mesos. When I ran my application, I
got the following error, which looks related to a mismatch of protobuf
versions between the Hadoop cluster (protobuf 2.5) and Spark (protobuf
2.4.1). I ran "mvn dependency:tree -Dincludes=*protobuf*" and found that
akka pulled in this protobuf 2.4.1. Has anyone seen this problem before?
Thanks.
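
For reference, the job was essentially the stock word count, typed into
spark-shell (the input path here is just illustrative):

    // sc is the SparkContext that spark-shell provides
    val lines = sc.textFile("hdfs:///tmp/input.txt")
    // split lines into words, pair each word with 1, then sum the counts
    val counts = lines.flatMap(_.split("\\s+")).map(word => (word, 1)).reduceByKey(_ + _)
    counts.collect().foreach(println)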

Error when running Spark on Hadoop 2.4.0:
java.lang.VerifyError: class
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AppendRequestProto
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;

> mvn dependency:tree -Dincludes=*protobuf*
...
[INFO] --- maven-dependency-plugin:2.8:tree (default-cli) @ spark-core_2.10 ---
[INFO] org.apache.spark:spark-core_2.10:jar:1.1.0
[INFO] \- org.spark-project.akka:akka-remote_2.10:jar:2.2.3-shaded-protobuf:compile
[INFO]    \- org.spark-project.protobuf:protobuf-java:jar:2.4.1-shaded:compile
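
In case it helps with the diagnosis: since the VerifyError means
protobuf-2.5-generated classes are loading against 2.4.1 on the classpath,
I assume the shaded artifact can be inspected for classes still under
com/google/protobuf with something like this (the path is just the default
local Maven repo layout for the coordinates above):

    jar tf ~/.m2/repository/org/spark-project/protobuf/protobuf-java/2.4.1-shaded/protobuf-java-2.4.1-shaded.jar \
      | grep com/google/protobuf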
