[ https://issues.apache.org/jira/browse/SPARK-12345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15188589#comment-15188589 ]

Eran Withana edited comment on SPARK-12345 at 3/10/16 3:20 AM:
---------------------------------------------------------------

Is the fix for this issue available in the Spark 1.6.0 release?

I just used Spark 1.6.0 and got the following error in the Mesos logs when it
tried to run the task:

{code}
I0310 03:13:11.417009 131594 exec.cpp:132] Version: 0.23.1
I0310 03:13:11.419452 131601 exec.cpp:206] Executor registered on slave 20160223-000314-3439362570-5050-631-S0
sh: 1: /usr/spark-1.6.0-bin-hadoop2.6/bin/spark-class: not found
{code}
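
The "spark-class: not found" line suggests there is no Spark distribution at /usr/spark-1.6.0-bin-hadoop2.6 inside the executor's container. A quick way to confirm where (or whether) Spark is installed in the image is to inspect it directly. This is only a diagnostic sketch; it assumes the image from the spark-submit script below and that Docker is available on the host:

{code}
# Hedged check: list candidate Spark install locations inside the image.
# Adjust the globs if the image lays Spark out differently.
docker run --rm echinthaka/mesos-spark:0.23.1-1.6.0-2.6 \
  sh -c 'ls -d /usr/spark* /opt/spark* 2>/dev/null'
{code}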

To provide more context, here is my spark-submit script:

{code}
$SPARK_HOME/bin/spark-submit \
  --class com.mycompany.SparkStarter \
  --master mesos://mesos-dispatcher:7077 \
  --name SparkStarterJob \
  --driver-memory 1G \
  --executor-memory 4G \
  --deploy-mode cluster \
  --total-executor-cores 1 \
  --conf spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6 \
  http://abc.com/spark-starter.jar
{code}
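
If Spark lives somewhere else inside that image, pointing the executors at it explicitly may get past the "not found" error. The sketch below is only an illustration: /opt/spark is an assumed path, so substitute whatever the check above reports; spark.mesos.executor.home controls where executors look for the Spark distribution.

{code}
# Hedged sketch: same submit command, plus an explicit executor-side Spark home.
# /opt/spark is an assumed path inside echinthaka/mesos-spark:0.23.1-1.6.0-2.6;
# replace it with the real location of the Spark distribution in the image.
$SPARK_HOME/bin/spark-submit \
  --class com.mycompany.SparkStarter \
  --master mesos://mesos-dispatcher:7077 \
  --name SparkStarterJob \
  --driver-memory 1G \
  --executor-memory 4G \
  --deploy-mode cluster \
  --total-executor-cores 1 \
  --conf spark.mesos.executor.docker.image=echinthaka/mesos-spark:0.23.1-1.6.0-2.6 \
  --conf spark.mesos.executor.home=/opt/spark \
  http://abc.com/spark-starter.jar
{code}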


> Mesos cluster mode is broken
> ----------------------------
>
>                 Key: SPARK-12345
>                 URL: https://issues.apache.org/jira/browse/SPARK-12345
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Andrew Or
>            Assignee: Timothy Chen
>            Priority: Critical
>             Fix For: 1.6.0
>
>
> The same setup worked in 1.5.2 but is now failing for 1.6.0-RC2.
> The driver is confused about where SPARK_HOME is. It resolves 
> `mesos.executor.uri` or `spark.mesos.executor.home` relative to the 
> filesystem where the driver runs, which is wrong.
> {code}
> I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0
> I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave 130bdc39-44e7-4256-8c22-602040d337f1-S1
> bin/spark-submit: line 27: /Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6/bin/spark-class: No such file or directory
> {code}
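
Given the description above (the driver resolving the executor home against its own filesystem), a possible workaround on affected builds is to make the executor-side Spark location explicit rather than relying on SPARK_HOME. This is a hedged sketch, not the committed fix: the tarball URL is a placeholder, and either setting would be passed via --conf on spark-submit.

{code}
# Hedged workaround sketch: give executors their own Spark, independent of the
# driver's SPARK_HOME. The tarball URL is a placeholder reachable by the agents.
--conf spark.executor.uri=http://abc.com/spark-1.6.0-bin-hadoop2.6.tgz
# or, when running executors in a Docker image, point at the path inside it:
--conf spark.mesos.executor.home=/opt/spark
{code}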


