Hello Mark,

First, sorry for the sparse details about my Spark version. In my cluster I have CDH 4.5 (Hadoop 2.0.0) installed, and I have now successfully installed the latest Spark release: spark-0.9.0-incubating <http://d3kbcqa49mib13.cloudfront.net/spark-0.9.0-incubating.tgz>.
You are right: I need to build Spark locally, on each node of the cluster, against my exact Hadoop version:

SPARK_HADOOP_VERSION=2.0.0-cdh4.5.0 sbt/sbt assembly publish-local

Thanks for your help,

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/What-I-am-missing-from-configuration-tp878p1387.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.