Repository: spark
Updated Branches:
  refs/heads/branch-1.1 092121e47 -> 935bffe3b


[SPARK-2608][Core] Fixed command line option passing issue over Mesos via SPARK_EXECUTOR_OPTS

This is another try after #2145 to fix 
[SPARK-2608](https://issues.apache.org/jira/browse/SPARK-2608).

### Basic Idea

The basic idea is to pass `extraJavaOpts` and `extraLibraryPath` together via 
the environment variable `SPARK_EXECUTOR_OPTS`. This variable is recognized by 
`spark-class` and is not used anywhere else. In this way, we still launch Mesos 
executors with `spark-class`/`spark-executor`, but avoid the executor-side 
Spark home issue.
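
The folding of the two settings into one environment variable value can be sketched without any Spark/Mesos dependencies (a minimal sketch; `buildExecutorOpts` is a hypothetical helper mirroring the patch, not an API from the codebase):

```scala
// Minimal sketch of how the two settings are combined into the single
// string assigned to SPARK_EXECUTOR_OPTS (names mirror the patch below).
object ExecutorOptsSketch {
  def buildExecutorOpts(
      extraJavaOpts: Option[String],
      extraLibraryPath: Option[String]): String = {
    // The library path, if set, becomes a -Djava.library.path option.
    val libraryPathOpt = extraLibraryPath.map(p => s"-Djava.library.path=$p")
    // Unset options drop out via flatten; present ones are joined with spaces.
    Seq(extraJavaOpts, libraryPathOpt).flatten.mkString(" ")
  }

  def main(args: Array[String]): Unit = {
    // SPARK_EXECUTOR_OPTS would be set to this value for the executor launch.
    println(buildExecutorOpts(Some("-XX:+UseG1GC"), Some("/opt/native")))
  }
}
```

Because `spark-class` already reads `SPARK_EXECUTOR_OPTS`, nothing on the executor side needs to change to pick the options up.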

### Known Issue

Quoted strings with spaces are not allowed in either `extraJavaOpts` or 
`extraLibraryPath` when using Spark over Mesos. The reason is that Mesos passes 
the whole command line as a single string argument to `sh -c` to start the 
executor, which makes shell string escaping non-trivial to handle. This 
should be fixed in a later release.
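
The limitation can be illustrated without Mesos at all (a hypothetical option value; naive whitespace splitting approximates what the inner shell does once the quoting around the whole command line has been consumed):

```scala
// Illustrates the known quoting limitation, with no Mesos dependency.
object QuotingLimitationSketch {
  // Approximates the word splitting `sh -c` performs after the outer quoting
  // around the entire command line has been stripped.
  def naiveShellSplit(commandFragment: String): Seq[String] =
    commandFragment.split("\\s+").toSeq

  def main(args: Array[String]): Unit = {
    // A hypothetical extraJavaOpts value with a quoted, space-bearing string.
    val extraJavaOpts = """-Dapp.name="My App""""
    // The single option is torn into two tokens, so the executor JVM would
    // receive a mangled system property.
    println(naiveShellSplit(extraJavaOpts))
  }
}
```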

### Background

Classes in package `org.apache.spark.deploy` shouldn't be used here, as they 
assume Spark is deployed in standalone mode and report the wrong executor-side 
Spark home directory. Please refer to the comments in #2145 for more details.

Author: Cheng Lian <lian.cs....@gmail.com>

Closes #2161 from liancheng/mesos-fix-with-env-var and squashes the following 
commits:

ba59190 [Cheng Lian] Added fine grained Mesos executor support
1174076 [Cheng Lian] Draft fix for CoarseMesosSchedulerBackend


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/935bffe3
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/935bffe3
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/935bffe3

Branch: refs/heads/branch-1.1
Commit: 935bffe3bf6c91a42288bff8c1ec69fecb41a769
Parents: 092121e
Author: Cheng Lian <lian.cs....@gmail.com>
Authored: Wed Aug 27 12:39:21 2014 -0700
Committer: Patrick Wendell <pwend...@gmail.com>
Committed: Wed Aug 27 12:39:21 2014 -0700

----------------------------------------------------------------------
 .../cluster/mesos/CoarseMesosSchedulerBackend.scala   | 14 ++++++++++----
 .../cluster/mesos/MesosSchedulerBackend.scala         | 14 ++++++++++++++
 2 files changed, 24 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/935bffe3/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
index f017250..8c7cb07 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/CoarseMesosSchedulerBackend.scala
@@ -122,6 +122,12 @@ private[spark] class CoarseMesosSchedulerBackend(
     val extraLibraryPath = conf.getOption(libraryPathOption).map(p => s"-Djava.library.path=$p")
     val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
 
+    environment.addVariables(
+      Environment.Variable.newBuilder()
+        .setName("SPARK_EXECUTOR_OPTS")
+        .setValue(extraOpts)
+        .build())
+
     sc.executorEnvs.foreach { case (key, value) =>
       environment.addVariables(Environment.Variable.newBuilder()
         .setName(key)
@@ -140,16 +146,16 @@ private[spark] class CoarseMesosSchedulerBackend(
     if (uri == null) {
       val runScript = new File(sparkHome, "./bin/spark-class").getCanonicalPath
       command.setValue(
-        "\"%s\" org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %s %d".format(
-          runScript, extraOpts, driverUrl, offer.getSlaveId.getValue, offer.getHostname, numCores))
+        "\"%s\" org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d".format(
+          runScript, driverUrl, offer.getSlaveId.getValue, offer.getHostname, numCores))
     } else {
       // Grab everything to the first '.'. We'll use that and '*' to
       // glob the directory "correctly".
       val basename = uri.split('/').last.split('.').head
       command.setValue(
         ("cd %s*; " +
-          "./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %s %d")
-          .format(basename, extraOpts, driverUrl, offer.getSlaveId.getValue,
+          "./bin/spark-class org.apache.spark.executor.CoarseGrainedExecutorBackend %s %s %s %d")
+          .format(basename, driverUrl, offer.getSlaveId.getValue,
             offer.getHostname, numCores))
       command.addUris(CommandInfo.URI.newBuilder().setValue(uri))
     }

http://git-wip-us.apache.org/repos/asf/spark/blob/935bffe3/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
index c717e7c..e84ce09 100644
--- a/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
+++ b/core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
@@ -90,6 +90,20 @@ private[spark] class MesosSchedulerBackend(
       "Spark home is not set; set it through the spark.home system " +
      "property, the SPARK_HOME environment variable or the SparkContext constructor"))
     val environment = Environment.newBuilder()
+    sc.conf.getOption("spark.executor.extraClassPath").foreach { cp =>
+      environment.addVariables(
+        Environment.Variable.newBuilder().setName("SPARK_CLASSPATH").setValue(cp).build())
+    }
+    val extraJavaOpts = sc.conf.getOption("spark.executor.extraJavaOptions")
+    val extraLibraryPath = sc.conf.getOption("spark.executor.extraLibraryPath").map { lp =>
+      s"-Djava.library.path=$lp"
+    }
+    val extraOpts = Seq(extraJavaOpts, extraLibraryPath).flatten.mkString(" ")
+    environment.addVariables(
+      Environment.Variable.newBuilder()
+        .setName("SPARK_EXECUTOR_OPTS")
+        .setValue(extraOpts)
+        .build())
     sc.executorEnvs.foreach { case (key, value) =>
       environment.addVariables(Environment.Variable.newBuilder()
         .setName(key)

