Repository: spark
Updated Branches:
  refs/heads/master 82fb5bfa7 -> a4470bc78


[SPARK-21673] Use the correct sandbox environment variable set by Mesos

## What changes were proposed in this pull request?
This change updates Spark to read the Mesos sandbox path from the MESOS_SANDBOX
environment variable, which Mesos actually sets in the container on startup,
instead of MESOS_DIRECTORY.
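
To make the precedence concrete, here is a minimal, hypothetical sketch of the
local-directory resolution order this patch touches. The real logic lives in
Utils.getConfiguredLocalDirs and goes through Spark's conf.getenv wrapper; the
resolveLocalDirs helper and the sys.env lookups below are illustrative only.

```scala
// Simplified, illustrative sketch; not Spark's actual implementation.
object LocalDirsSketch {
  def resolveLocalDirs(shuffleServiceEnabled: Boolean): Array[String] = {
    val env = sys.env
    if (env.contains("SPARK_EXECUTOR_DIRS")) {
      env("SPARK_EXECUTOR_DIRS").split(java.io.File.pathSeparator)
    } else if (env.contains("SPARK_LOCAL_DIRS")) {
      env("SPARK_LOCAL_DIRS").split(",")
    } else if (env.contains("MESOS_SANDBOX") && !shuffleServiceEnabled) {
      // Use the per-task sandbox Mesos creates, so temporary files are
      // cleaned up automatically when the Mesos task ends.
      Array(env("MESOS_SANDBOX"))
    } else {
      // The real code falls back to spark.local.dir / java.io.tmpdir here.
      Array(System.getProperty("java.io.tmpdir"))
    }
  }
}
```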

Author: Jake Charland <ja...@uber.com>

Closes #18894 from jakecharland/MesosSandbox.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/a4470bc7
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/a4470bc7
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/a4470bc7

Branch: refs/heads/master
Commit: a4470bc78ca5f5a090b6831a7cdca88274eb9afc
Parents: 82fb5bf
Author: Jake Charland <ja...@uber.com>
Authored: Tue May 22 08:06:15 2018 -0500
Committer: Sean Owen <sro...@gmail.com>
Committed: Tue May 22 08:06:15 2018 -0500

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/util/Utils.scala | 8 ++++----
 docs/configuration.md                                 | 2 +-
 2 files changed, 5 insertions(+), 5 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/a4470bc7/core/src/main/scala/org/apache/spark/util/Utils.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/util/Utils.scala b/core/src/main/scala/org/apache/spark/util/Utils.scala
index 13adaa9..f9191a5 100644
--- a/core/src/main/scala/org/apache/spark/util/Utils.scala
+++ b/core/src/main/scala/org/apache/spark/util/Utils.scala
@@ -810,15 +810,15 @@ private[spark] object Utils extends Logging {
       conf.getenv("SPARK_EXECUTOR_DIRS").split(File.pathSeparator)
     } else if (conf.getenv("SPARK_LOCAL_DIRS") != null) {
       conf.getenv("SPARK_LOCAL_DIRS").split(",")
-    } else if (conf.getenv("MESOS_DIRECTORY") != null && !shuffleServiceEnabled) {
+    } else if (conf.getenv("MESOS_SANDBOX") != null && !shuffleServiceEnabled) {
       // Mesos already creates a directory per Mesos task. Spark should use that directory
       // instead so all temporary files are automatically cleaned up when the Mesos task ends.
       // Note that we don't want this if the shuffle service is enabled because we want to
       // continue to serve shuffle files after the executors that wrote them have already exited.
-      Array(conf.getenv("MESOS_DIRECTORY"))
+      Array(conf.getenv("MESOS_SANDBOX"))
     } else {
-      if (conf.getenv("MESOS_DIRECTORY") != null && shuffleServiceEnabled) {
-        logInfo("MESOS_DIRECTORY available but not using provided Mesos sandbox because " +
+      if (conf.getenv("MESOS_SANDBOX") != null && shuffleServiceEnabled) {
+        logInfo("MESOS_SANDBOX available but not using provided Mesos sandbox because " +
           "spark.shuffle.service.enabled is enabled.")
       }
       // In non-Yarn mode (or for the driver in yarn-client mode), we cannot trust the user

http://git-wip-us.apache.org/repos/asf/spark/blob/a4470bc7/docs/configuration.md
----------------------------------------------------------------------
diff --git a/docs/configuration.md b/docs/configuration.md
index 8a1aace..fd2670c 100644
--- a/docs/configuration.md
+++ b/docs/configuration.md
@@ -208,7 +208,7 @@ of the most common options to set are:
     stored on disk. This should be on a fast, local disk in your system. It can also be a
     comma-separated list of multiple directories on different disks.
 
-    NOTE: In Spark 1.0 and later this will be overridden by SPARK_LOCAL_DIRS (Standalone, Mesos) or
+    NOTE: In Spark 1.0 and later this will be overridden by SPARK_LOCAL_DIRS (Standalone), MESOS_SANDBOX (Mesos) or
     LOCAL_DIRS (YARN) environment variables set by the cluster manager.
   </td>
 </tr>
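
For context, a hedged usage sketch of the spark.local.dir setting this note
documents; the application name and paths are placeholders, and on a cluster
manager the value is superseded by the environment variables listed above
(SPARK_LOCAL_DIRS, MESOS_SANDBOX, or LOCAL_DIRS).

```scala
import org.apache.spark.SparkConf

// spark.local.dir set programmatically; directories shown are made up.
// Under Mesos this is overridden by the MESOS_SANDBOX path.
val conf = new SparkConf()
  .setAppName("local-dir-example")
  .set("spark.local.dir", "/mnt/fast1/spark-tmp,/mnt/fast2/spark-tmp")
```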


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
