Hi,
has this issue been resolved? I am currently running into similar problems.
I am using spark-1.3.0-bin-hadoop2.4 on Windows and Ubuntu. I have set up all
paths on my Windows machine in an identical manner to my Ubuntu server
(using Cygwin, so everything is somewhere under
Hi CJ,
Looks like I overlooked a few lines in the spark-shell case. It appears that
spark-shell explicitly overwrites spark.home to whatever SPARK_HOME is:
https://github.com/apache/spark/blob/f4f46dec5ae1da48738b9b650d3de155b59c4674/repl/src/main/scala/org/apache/spark/repl/SparkILoop.scala#L955
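Best I can tell, the effect is plain environment-over-config precedence, like what a shell launcher would do; a minimal sketch of that behavior (the paths and variable names below are illustrative, not Spark's actual launcher code):

```shell
# Sketch of the precedence spark-shell appears to apply: an environment
# SPARK_HOME clobbers whatever spark.home was configured. Paths are made up.
conf_spark_home="/root/spark-1.0.0"      # value read from conf/spark-defaults.conf
export SPARK_HOME="/usr/local/spark"     # value present in the launching environment

# Equivalent of the override: prefer the environment variable when it is set.
effective_home="${SPARK_HOME:-$conf_spark_home}"
echo "$effective_home"   # prints /usr/local/spark
```

So even a correctly configured spark.home would be ignored whenever SPARK_HOME is set in the environment that launches the shell.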
Not sure that was what I wanted. I tried to run Spark Shell on a machine other
than the master and got the same error. The 192 was supposed to be a
simple shell script change that alters SPARK_HOME before submitting jobs.
Too bad it wasn't there anymore.
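For reference, a wrapper of that kind could be as small as the sketch below (the install path is a placeholder, not something from this thread):

```shell
#!/bin/sh
# Hypothetical wrapper: force SPARK_HOME to the cluster-side install path
# before launching spark-shell. The path below is a placeholder.
SPARK_HOME=/root/spark-1.0.0
export SPARK_HOME
# A real script would exec spark-shell here; echoed for illustration only.
echo "would run: $SPARK_HOME/bin/spark-shell"
```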
The build described in the pull request
Hi C.J.,
The PR Yana pointed out seems to fix this. However, it has not been merged
into master yet, so for now I would recommend that you try the following
workaround: set spark.home to the executor's /path/to/spark. I provided
more detail here:
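Concretely, the workaround amounts to one extra line in the defaults file; a sketch (keeping the placeholder /path/to/spark for wherever Spark actually lives on the executors):

```shell
# Sketch of the workaround: add spark.home to conf/spark-defaults.conf so the
# driver tells executors where Spark lives on their machines.
mkdir -p conf
printf 'spark.home\t/path/to/spark\n' >> conf/spark-defaults.conf
cat conf/spark-defaults.conf
```

Note that, given the spark-shell override discussed above, this setting can still be clobbered if SPARK_HOME is set in the launching environment.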
Andrew,
Thanks for replying. I did the following, and the result was still the same.
1. Added spark.home /root/spark-1.0.0 to my local conf/spark-defaults.conf,
where /root was the place on the cluster where I put Spark.
2. Ran bin/spark-shell --master