Repository: zeppelin
Updated Branches:
  refs/heads/master 68512b9cd -> 92f244ef7


ZEPPELIN-3633. ZeppelinContext Not Found in yarn-cluster Mode

### What is this PR for?
This issue is caused by the classpath in yarn-cluster mode: the driver runs on a 
node of the YARN cluster where Zeppelin is not installed, so the Zeppelin Spark 
interpreter classes (including ZeppelinContext) are not on the driver's classpath. 
This PR fixes the issue by updating the classpath of the Spark repl.
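
For context, a minimal sketch of the approach, assuming a `SparkConf` named `conf` and that the interpreter class is loaded from a jar; the object and method names here are illustrative only (the real change is in the diff below):

```scala
import java.io.File
import org.apache.spark.SparkConf

// Illustrative helper (hypothetical name): append the jar that contains this
// class to spark.jars so YARN ships it to the driver in yarn-cluster mode,
// where Zeppelin itself is not installed.
object ReplClasspathSketch {
  def addSelfJar(conf: SparkConf): Unit = {
    val location = getClass.getProtectionDomain.getCodeSource.getLocation
    // Under unit tests the code source may be a classes/ directory rather
    // than a jar, in which case there is nothing to ship.
    if (!new File(location.getFile).isDirectory) {
      val existing = conf.getOption("spark.jars")
        .map(_.split(",").filter(_.nonEmpty)).toSeq.flatten
      conf.set("spark.jars", (existing :+ location.getFile).mkString(","))
    }
  }
}
```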

### What type of PR is it?
[Bug Fix]

### Todos
* [ ] - Task

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-3633

### How should this be tested?
* Manually tested. Unfortunately this issue only reproduces on a multi-node 
cluster, so no unit test can be added for this scenario; see the sketch below 
for one possible manual check.
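
One possible manual check (an assumption about the setup, not part of this patch): with the Spark interpreter configured for yarn-cluster mode, run a notebook paragraph that touches the injected ZeppelinContext (`z`) and inspect `spark.jars`:

```scala
%spark
// Should print the ZeppelinContext class name instead of failing with
// ClassNotFoundException, and spark.jars should now include the
// zeppelin spark interpreter jar.
println(z.getClass.getName)
println(sc.getConf.get("spark.jars", ""))
```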

### Screenshots (if appropriate)

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <zjf...@apache.org>

Closes #3181 from zjffdu/ZEPPELIN-3633 and squashes the following commits:

e800037d9 [Jeff Zhang] ZEPPELIN-3633. ZeppelinContext Not Found in yarn-cluster Mode


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/92f244ef
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/92f244ef
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/92f244ef

Branch: refs/heads/master
Commit: 92f244ef7e1902e51dbec6b759152341992d834c
Parents: 68512b9
Author: Jeff Zhang <zjf...@apache.org>
Authored: Wed Sep 19 11:21:50 2018 +0800
Committer: Jeff Zhang <zjf...@apache.org>
Committed: Wed Sep 26 08:52:36 2018 +0800

----------------------------------------------------------------------
 spark/interpreter/figure/unnamed-chunk-1-1.png    | Bin 0 -> 407541 bytes
 .../spark/BaseSparkScalaInterpreter.scala         |  10 +++++++++-
 2 files changed, 9 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/92f244ef/spark/interpreter/figure/unnamed-chunk-1-1.png
----------------------------------------------------------------------
diff --git a/spark/interpreter/figure/unnamed-chunk-1-1.png b/spark/interpreter/figure/unnamed-chunk-1-1.png
new file mode 100644
index 0000000..6f03c95
Binary files /dev/null and b/spark/interpreter/figure/unnamed-chunk-1-1.png differ

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/92f244ef/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
----------------------------------------------------------------------
diff --git a/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala b/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
index a73630a..2cbda93 100644
--- a/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
+++ b/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
@@ -372,8 +372,16 @@ abstract class BaseSparkScalaInterpreter(val conf: SparkConf,
     val sparkJars = conf.getOption("spark.jars").map(_.split(","))
       .map(_.filter(_.nonEmpty)).toSeq.flatten
     val depJars = depFiles.asScala.filter(_.endsWith(".jar"))
-    val result = sparkJars ++ depJars
+    // add zeppelin spark interpreter jar
+    val zeppelinInterpreterJarURL = getClass.getProtectionDomain.getCodeSource.getLocation
+    // zeppelinInterpreterJarURL might be a folder when under unit testing
+    val result = if (new File(zeppelinInterpreterJarURL.getFile).isDirectory) {
+      sparkJars ++ depJars
+    } else {
+      sparkJars ++ depJars ++ Seq(zeppelinInterpreterJarURL.getFile)
+    }
     conf.set("spark.jars", result.mkString(","))
+    LOGGER.debug("User jar for spark repl: " + conf.get("spark.jars"))
     result
   }
 
