Hi all,

I deployed a Spark client on my own machine. My Spark installation is at `/home/somebody/spark`, while the cluster workers' Spark home is `/home/spark/spark`. When I launch my jar, the application fails with:

`AppClient$ClientActor: Executor updated: app-20141124170955-11088/12 is now FAILED (java.io.IOException: Cannot run program "/home/somebody/proc/spark_client/spark/bin/compute-classpath.sh" (in directory "."): error=2, No such file or directory)`

The worker should be running `/home/spark/spark/bin/compute-classpath.sh`, not the client's `compute-classpath.sh`. It looks as though I had set some environment variable to the client path, but in fact there is no spark-env.sh or spark-defaults.conf under my client's Spark path. Any hints?
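For reference, the kind of check described above (looking for anything that pins the client path) can be sketched as a small shell diagnostic. The script is my own scratch tooling, not anything shipped with Spark, and `CLIENT_SPARK_HOME` is just the client install path named earlier:

```shell
#!/bin/sh
# Rough diagnostic: list the places where the client's Spark path could
# leak into the application description the master ships to the workers.

CLIENT_SPARK_HOME=/home/somebody/spark   # the client install named above

check_spark_leak() {
  # 1. SPARK_* environment variables the driver JVM would inherit:
  env | grep '^SPARK' || echo "no SPARK_* variables set"

  # 2. Client-side config files that could pin the executors' Spark home:
  for f in "$CLIENT_SPARK_HOME/conf/spark-env.sh" \
           "$CLIENT_SPARK_HOME/conf/spark-defaults.conf"; do
    if [ -f "$f" ]; then
      grep -n 'spark' "$f"
    else
      echo "missing: $f"
    fi
  done
}

check_spark_leak
```

If nothing turns up there, I believe (though I am not certain) that in standalone mode the driver's SparkConf can also set the path the workers use, via `spark.home` / `SparkConf.setSparkHome(...)`, so that may be worth checking as well.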
Thanks.                                    
