Andrew,

Thanks for replying.  I tried the following, but the result was still the same.

1. Added "spark.home /root/spark-1.0.0" to my local conf/spark-defaults.conf,
where "/root/spark-1.0.0" is the directory where I installed Spark on the
cluster machines. (The full file is quoted below.)

2. Ran "bin/spark-shell --master
spark://sjc1-eng-float01.carrieriq.com:7077".

3. Sighed when I still saw the same error:

14/07/10 18:26:53 INFO AppClient$ClientActor: Executor updated:
app-20140711012651-0007/5 is now FAILED (class java.io.IOException: Cannot
run program "/Users/cwang/spark/bin/compute-classpath.sh" (in directory
"."): error=2, No such file or directory)

/Users/cwang/spark is my local SPARK_HOME, which is the wrong path on the
cluster machines; the executors should be using /root/spark-1.0.0 instead.
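
For reference, the only line in my local conf/spark-defaults.conf is the one
quoted above (my understanding is that the format is one property per line,
key and value separated by whitespace):

    spark.home /root/spark-1.0.0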

What did I do wrong?  And how can I tell whether the config file is actually
being picked up?
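
Is something like the following, from within spark-shell, a sensible way to
check? (I am assuming sc.getConf and SparkConf.toDebugString are available in
1.0.0.)

    scala> sc.getConf.getOption("spark.home")  // I'd expect Some(/root/spark-1.0.0) if the file was read
    scala> sc.getConf.toDebugString            // lists all explicitly-set config entries, one per line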

I am a novice at Spark, so please bear with me.

 


