[ https://issues.apache.org/jira/browse/SPARK-12345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15065047#comment-15065047 ]
Andrew Or commented on SPARK-12345:
-----------------------------------

For those who are following: there are 4 patches related to this issue, merged in this order:

(1) https://github.com/apache/spark/pull/10332 - doesn't actually work
(2) https://github.com/apache/spark/pull/10359 - fixes #10332 to make it actually work
(3) https://github.com/apache/spark/pull/10366 - fixes #10359, which broke HA (SPARK-12413)
(4) https://github.com/apache/spark/pull/10329 - an alternative, more correct fix

Patches (1), (2), and (3) are merged ONLY into branch-1.6. Patch (4) is merged ONLY into master. We have a different fix for branch-1.6 because the issue was an RC blocker and we wanted to minimize the scope of the changes there. However, patch (4) is the better fix, so it lives in master for the longer term.

> Mesos cluster mode is broken
> ----------------------------
>
>                 Key: SPARK-12345
>                 URL: https://issues.apache.org/jira/browse/SPARK-12345
>             Project: Spark
>          Issue Type: Bug
>          Components: Mesos
>    Affects Versions: 1.6.0
>            Reporter: Andrew Or
>            Assignee: Timothy Chen
>            Priority: Critical
>             Fix For: 1.6.0
>
>
> The same setup worked in 1.5.2 but is now failing for 1.6.0-RC2.
> The driver is confused about where SPARK_HOME is. It resolves
> `mesos.executor.uri` or `spark.mesos.executor.home` relative to the
> filesystem where the driver runs, which is wrong.
> {code}
> I1215 15:00:39.411212 28032 exec.cpp:134] Version: 0.25.0
> I1215 15:00:39.413512 28037 exec.cpp:208] Executor registered on slave
> 130bdc39-44e7-4256-8c22-602040d337f1-S1
> bin/spark-submit: line 27:
> /Users/dragos/workspace/Spark/dev/rc-tests/spark-1.6.0-bin-hadoop2.6/bin/spark-class:
> No such file or directory
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org