Hi C.J.,

The PR Yana pointed out seems to fix this. However, it has not been merged
into master yet, so for now I would recommend the following workaround: set
"spark.home" to the executor's /path/to/spark. I provided more detail here:
http://mail-archives.apache.org/mod_mbox/spark-user/201407.mbox/%3cCAMJOb8mYTzxrHWcaDOnVoOTw1TFrd9kJjOyj1=nkgmsk5vs...@mail.gmail.com%3e
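Concretely, the workaround amounts to pointing the spark.home property at the
Spark installation directory on the executors. One way to sketch it (the
conf/spark-defaults.conf location is standard Spark configuration; the
/path/to/spark placeholder from above stands in for your executors' actual
Spark directory):

```
# conf/spark-defaults.conf on the machine you submit from
# (illustrative sketch -- replace the placeholder with the executors' real path)
spark.home    /path/to/spark
```

The same property can also be set programmatically on a SparkConf before the
SparkContext is created.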

Andrew


2014-07-10 1:57 GMT-07:00 cjwang <c...@cjwang.us>:

> Not sure that was what I want.  I tried to run Spark Shell on a machine
> other than the master and got the same error.  The "192" was supposed to
> be a simple shell script change that alters SPARK_HOME before submitting
> jobs.  Too bad it isn't there anymore.
>
> The build described in the pull request (16440) seems to have failed, so I
> can't use it.
>
> I am looking for those shell script changes.
>
>
>
> --
> View this message in context:
> http://apache-spark-user-list.1001560.n3.nabble.com/executor-failed-cannot-find-compute-classpath-sh-tp859p9277.html
> Sent from the Apache Spark User List mailing list archive at Nabble.com.
>