After building Spark, I modified the SimpleApp.java from the quick-start
tutorial to read an HDFS file. When I try to execute it, I get the error
below:

Caused by: java.lang.VerifyError: class
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$SetOwnerRequestProto
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;

[ERROR] Failed to execute goal
org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project
simple-project: An exception occured while executing the Java class. null:
InvocationTargetException: class
org.apache.hadoop.hdfs.protocol.proto.ClientNamenodeProtocolProtos$AddBlockRequestProto
overrides final method
getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet; -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute
goal org.codehaus.mojo:exec-maven-plugin:1.2.1:java (default-cli) on project
simple-project: An exception occured while executing the Java class. null at
org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:217)

My environment:
Hadoop 2.2.0 / Ubuntu 12.04 / Spark 0.9.0 / Scala 2.10.3
(OpenJDK Server VM, Java 1.7.0_51)

I built the target specifying the correct Hadoop version.
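For reference, the build was run roughly as described in the Spark 0.9.0 docs for Hadoop 2.2.x; the exact invocation on my machine may have differed slightly, but it was one of these two forms (build configuration, not a runnable test):

```shell
# sbt build, passing the Hadoop version via the environment variable
# the Spark 0.9.0 docs describe:
SPARK_HADOOP_VERSION=2.2.0 sbt/sbt assembly

# or, equivalently, the Maven build with the hadoop.version property:
mvn -Dhadoop.version=2.2.0 -DskipTests clean package
```

(The VerifyError on getUnknownFields() usually indicates a protobuf version mismatch between what Spark was built against and what Hadoop 2.2.0 expects, so the build flags seemed relevant to include.)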

Any help would be appreciated.
Thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Unable-to-read-HDFS-file-SimpleApp-java-tp1813.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
