I had this problem too with Spark 1.1.1.  At the time I was using Hadoop
0.20.

To get around it, I installed Hadoop 2.5.2 and set protobuf.version to
2.5.0 in the build command, like so:
    mvn -Phadoop-2.5 -Dhadoop.version=2.5.2 -Dprotobuf.version=2.5.0 -DskipTests clean package

For that to work, I changed Spark's pom.xml so that protobuf.version can be
set from the command line.
If I didn't explicitly set protobuf.version, the build picked up an older
version that existed somewhere on my filesystem.
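The pom.xml change was roughly the sketch below. This is only an
illustration: the default value 2.4.1 and the exact placement in Spark's
pom are assumptions. The point is that protobuf-java's version comes from
a ${protobuf.version} property, which -Dprotobuf.version then overrides:

    <!-- top-level pom.xml: default protobuf version, overridable with
         -Dprotobuf.version=... (2.4.1 here is only a placeholder) -->
    <properties>
      <protobuf.version>2.4.1</protobuf.version>
    </properties>

    <!-- wherever protobuf-java is declared, reference the property
         instead of a hard-coded version -->
    <dependency>
      <groupId>com.google.protobuf</groupId>
      <artifactId>protobuf-java</artifactId>
      <version>${protobuf.version}</version>
    </dependency>

You can check which value the build actually resolves with the standard
Maven help plugin:

    mvn help:evaluate -Dexpression=protobuf.version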

Karen