Repository: spark
Updated Branches:
  refs/heads/branch-1.6 1fbca4120 -> 881f2544e


[SPARK-12345][MESOS] Properly filter out SPARK_HOME in the Mesos REST server

Fix a problem introduced by #10332; this change should fix cluster mode on Mesos.

Author: Iulian Dragos <jagua...@gmail.com>

Closes #10359 from dragos/issue/fix-spark-12345-one-more-time.

(cherry picked from commit 8184568810e8a2e7d5371db2c6a0366ef4841f70)
Signed-off-by: Kousuke Saruta <saru...@oss.nttdata.co.jp>


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/881f2544
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/881f2544
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/881f2544

Branch: refs/heads/branch-1.6
Commit: 881f2544e13679c185a7c34ddb82e885aaa79813
Parents: 1fbca41
Author: Iulian Dragos <jagua...@gmail.com>
Authored: Fri Dec 18 03:19:31 2015 +0900
Committer: Kousuke Saruta <saru...@oss.nttdata.co.jp>
Committed: Fri Dec 18 03:37:43 2015 +0900

----------------------------------------------------------------------
 .../scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/881f2544/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala b/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala
index 7c01ae4..196338f 100644
--- a/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala
+++ b/core/src/main/scala/org/apache/spark/deploy/rest/mesos/MesosRestServer.scala
@@ -99,7 +99,7 @@ private[mesos] class MesosSubmitRequestServlet(
     // look for files in SPARK_HOME instead. We only need the ability to specify where to find
     // spark-submit script which user can user spark.executor.home or spark.home configurations
     // (SPARK-12345).
-    val environmentVariables = request.environmentVariables.filter(!_.equals("SPARK_HOME"))
+    val environmentVariables = request.environmentVariables.filterKeys(!_.equals("SPARK_HOME"))
     val name = request.sparkProperties.get("spark.app.name").getOrElse(mainClass)
 
     // Construct driver description

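For context on why the one-word change matters: request.environmentVariables is a Map[String, String], and Map.filter hands the predicate a (key, value) tuple, so !_.equals("SPARK_HOME") is always true and SPARK_HOME was never removed; filterKeys applies the predicate to the key alone. Below is a minimal standalone sketch of that difference (the env map and object name are illustrative, not taken from the commit):

object SparkHomeFilterSketch {
  def main(args: Array[String]): Unit = {
    // Illustrative stand-in for request.environmentVariables; the values are made up.
    val env = Map("SPARK_HOME" -> "/opt/spark", "SPARK_ENV_LOADED" -> "1")

    // Map.filter passes a (key, value) tuple to the predicate, so comparing the
    // tuple to the string "SPARK_HOME" never matches and nothing is removed.
    val broken = env.filter(!_.equals("SPARK_HOME"))

    // filterKeys applies the predicate to the key alone, so SPARK_HOME is dropped.
    val fixed = env.filterKeys(!_.equals("SPARK_HOME"))

    println(broken.contains("SPARK_HOME")) // true  -- the old code kept SPARK_HOME
    println(fixed.contains("SPARK_HOME"))  // false -- the fixed code drops it
  }
}

On the Scala versions Spark 1.6 targets (2.10/2.11), filterKeys returns a Map; on 2.13 it is deprecated in favor of .view.filterKeys, but the key-only predicate behavior is the same.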
