Repository: spark
Updated Branches:
  refs/heads/master e6f7bfcfb -> bf04a390e


[SPARK-2392] Executors should not start their own HTTP servers

Executors currently start their own unused HTTP file servers. This is because 
we use the same SparkEnv class for both executors and drivers, and we do not 
distinguish between the two cases.

In the longer term, we should separate out SparkEnv for the driver and SparkEnv 
for the executors.

Author: Andrew Or <andrewo...@gmail.com>

Closes #1335 from andrewor14/executor-http-server and squashes the following 
commits:

46ef263 [Andrew Or] Start HTTP server only on the driver


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/bf04a390
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/bf04a390
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/bf04a390

Branch: refs/heads/master
Commit: bf04a390e40d60aa7fcc551501d25f3f9d38377c
Parents: e6f7bfc
Author: Andrew Or <andrewo...@gmail.com>
Authored: Tue Jul 8 17:35:31 2014 -0700
Committer: Reynold Xin <r...@apache.org>
Committed: Tue Jul 8 17:35:31 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/SparkEnv.scala | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/bf04a390/core/src/main/scala/org/apache/spark/SparkEnv.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/SparkEnv.scala b/core/src/main/scala/org/apache/spark/SparkEnv.scala
index 2b636b0..8f70744 100644
--- a/core/src/main/scala/org/apache/spark/SparkEnv.scala
+++ b/core/src/main/scala/org/apache/spark/SparkEnv.scala
@@ -79,7 +79,7 @@ class SparkEnv (
 
   private[spark] def stop() {
     pythonWorkers.foreach { case(key, worker) => worker.stop() }
-    httpFileServer.stop()
+    Option(httpFileServer).foreach(_.stop())
     mapOutputTracker.stop()
     shuffleManager.stop()
     broadcastManager.stop()
@@ -228,9 +228,15 @@ object SparkEnv extends Logging {
 
     val cacheManager = new CacheManager(blockManager)
 
-    val httpFileServer = new HttpFileServer(securityManager)
-    httpFileServer.initialize()
-    conf.set("spark.fileserver.uri",  httpFileServer.serverUri)
+    val httpFileServer =
+      if (isDriver) {
+        val server = new HttpFileServer(securityManager)
+        server.initialize()
+        conf.set("spark.fileserver.uri",  server.serverUri)
+        server
+      } else {
+        null
+      }
 
     val metricsSystem = if (isDriver) {
       MetricsSystem.createMetricsSystem("driver", conf, securityManager)
