[ https://issues.apache.org/jira/browse/SPARK-12345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15059937#comment-15059937 ]

Saisai Shao commented on SPARK-12345:
-------------------------------------

I think by default the Spark Mesos implementation ships all environment 
variables to the remote nodes, including {{SPARK_HOME}}. Mesos then invokes the 
Spark application through scripts, and those scripts honor {{SPARK_HOME}} if it 
is already set, so that's the problem.
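
To illustrate the mechanism, here is a rough sketch (not Spark's actual code; the helper name and its use of the Mesos protobuf builders are illustrative) of how forwarding the whole driver environment into a Mesos {{CommandInfo}} carries the driver's {{SPARK_HOME}} to the remote node:

{code}
import org.apache.mesos.Protos.{CommandInfo, Environment}

// Hypothetical helper: build the executor launch command, copying the
// driver's entire environment into the Mesos task environment.
def buildCommand(executorCmd: String): CommandInfo = {
  val envBuilder = Environment.newBuilder()
  // Copying every driver-side variable, including SPARK_HOME, means the
  // remote launch scripts see the driver's filesystem layout, not their own.
  sys.env.foreach { case (name, value) =>
    envBuilder.addVariables(
      Environment.Variable.newBuilder().setName(name).setValue(value))
  }
  CommandInfo.newBuilder()
    .setValue(executorCmd)
    .setEnvironment(envBuilder)
    .build()
}
{code}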

Basically, I think there are two sides on which we could fix this:

1. We should not expose {{SPARK_HOME}} to the environment unless it is set 
explicitly; otherwise, cases like this one can break (see the sketch after 
this list).
2. Spark on Mesos should not blindly ship all environment variables to the 
remote side. The best approach for Spark on Mesos is to invoke the Java 
program directly, as YARN currently does, rather than relying on scripts.
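
A minimal sketch of fix (1), assuming the executor environment is assembled from a plain map; the function and the reuse of the {{spark.mesos.executor.home}} key as the "explicitly set" signal are assumptions, not Spark's exact internals:

{code}
// Hypothetical helper: never ship the driver's SPARK_HOME blindly; only set
// it on the remote side when the user configured a home directory explicitly.
def executorEnv(conf: Map[String, String]): Map[String, String] = {
  val userSetHome = conf.get("spark.mesos.executor.home")
  val base = sys.env - "SPARK_HOME"  // drop the driver-local value
  userSetHome match {
    case Some(home) => base + ("SPARK_HOME" -> home)
    case None       => base
  }
}
{code}

With that guard, the remote scripts fall back to deriving their own location instead of inheriting a driver-side path that does not exist on the slave.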

> Mesos cluster mode is broken
> ----------------------------
>
>                 Key: SPARK-12345
>                 URL: https://issues.apache.org/jira/browse/SPARK-12345
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Andrew Or
>            Assignee: Apache Spark
>            Priority: Critical
>
> The same setup worked in 1.5.2 but is now failing for 1.6.0-RC2.
> The driver is confused about where SPARK_HOME is. It resolves 
> `mesos.executor.uri` or `spark.mesos.executor.home` relative to the 
> filesystem where the driver runs, which is wrong.
> {code}
> I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0
> I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave 130bdc39-44e7-4256-8c22-602040d337f1-S1
> bin/spark-submit: line 27: /Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6/bin/spark-class: No such file or directory
> {code}
