[ https://issues.apache.org/jira/browse/SPARK-12345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15060153#comment-15060153 ]

Luc Bourlier edited comment on SPARK-12345 at 12/16/15 3:33 PM:
----------------------------------------------------------------

I have almost the same fix, following the same logic: do not carry
{{SPARK_HOME}} information across systems. But I made the change on the
SparkSubmit side:

https://github.com/skyluc/spark/commit/5b6eaa5bf936ef42d46b53564816d62b2aa44e86

I'm running tests to check that Mesos is working fine with those changes.
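
For illustration, a minimal sketch of the idea (hypothetical code, not the actual commit linked above; the object and method names are made up):

{code}
// Minimal sketch of the idea, not the actual patch: before SparkSubmit
// forwards the launch environment to the Mesos dispatcher, drop SPARK_HOME
// so the remote side resolves its own Spark installation instead of the
// submitter's local path.
object SubmitEnv {
  def stripSparkHome(env: Map[String, String]): Map[String, String] =
    // SPARK_HOME points into the submitter's filesystem and is
    // meaningless on the Mesos slave that will run the driver.
    env - "SPARK_HOME"
}
{code}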


> Mesos cluster mode is broken
> ----------------------------
>
>                 Key: SPARK-12345
>                 URL: https://issues.apache.org/jira/browse/SPARK-12345
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Andrew Or
>            Assignee: Apache Spark
>            Priority: Critical
>
> The same setup worked in 1.5.2 but is now failing for 1.6.0-RC2.
> The driver is confused about where SPARK_HOME is. It resolves 
> `mesos.executor.uri` or `spark.mesos.executor.home` relative to the 
> filesystem where the driver runs, which is wrong.
> {code}
> I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0
> I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave 130bdc39-44e7-4256-8c22-602040d337f1-S1
> bin/spark-submit: line 27: /Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6/bin/spark-class: No such file or directory
> {code}
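
To make the failure mode concrete, here is an illustrative sketch (not the real scheduler code; {{executorCommand}} is a made-up name) of how a submitter-local Spark home ends up in the command run on the slave:

{code}
import java.io.File

object PathDemo {
  // Illustrative only: the command is built from a Spark home taken from
  // spark.mesos.executor.home (or the submitter's SPARK_HOME as a
  // fallback), so the path is resolved against the wrong filesystem.
  def executorCommand(executorHome: String): String =
    new File(executorHome, "./bin/spark-class").getPath

  def main(args: Array[String]): Unit = {
    // This path exists only on the submitter's machine, hence the
    // "No such file or directory" in the log above.
    println(executorCommand(
      "/Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6"))
  }
}
{code}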


