Repository: zeppelin
Updated Branches:
  refs/heads/branch-0.8 6fed0628a -> 4a8735530


ZEPPELIN-3633. ZeppelinContext Not Found in yarn-cluster Mode

### What is this PR for?
This issue is caused by the classpath in yarn-cluster mode: the driver runs on a 
node of the YARN cluster where Zeppelin is not installed, so the Zeppelin Spark 
interpreter classes (including ZeppelinContext) are missing from its classpath. 
This PR fixes the issue by updating the classpath of the Spark repl, as sketched 
below.
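
In essence (a simplified, hypothetical standalone version of the change to
`BaseSparkScalaInterpreter.scala`; the actual change is in the diff below), the
interpreter resolves the location of its own jar from the class's code source
and appends it to `spark.jars`, so YARN ships it to the driver node:

```scala
import java.io.File
import org.apache.spark.SparkConf

// Hypothetical helper mirroring the fix: append the Zeppelin Spark interpreter
// jar to spark.jars so the driver on the YARN node can load ZeppelinContext.
def addInterpreterJar(conf: SparkConf,
                      sparkJars: Seq[String],
                      depJars: Seq[String]): Seq[String] = {
  // Location of the jar (or classes directory) this class was loaded from.
  val zeppelinInterpreterJarURL =
    getClass.getProtectionDomain.getCodeSource.getLocation
  // Under unit testing the code source is a directory rather than a jar,
  // so skip it in that case.
  val result =
    if (new File(zeppelinInterpreterJarURL.getFile).isDirectory) {
      sparkJars ++ depJars
    } else {
      sparkJars ++ depJars ++ Seq(zeppelinInterpreterJarURL.getFile)
    }
  conf.set("spark.jars", result.mkString(","))
  result
}
```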

### What type of PR is it?
[Bug Fix]

### Todos
* [ ] - Task

### What is the Jira issue?
* https://issues.apache.org/jira/browse/ZEPPELIN-3633

### How should this be tested?
* Manually tested. Unfortunately the issue only reproduces on a multi-node 
cluster, so no unit test can be added for this scenario; see the example 
paragraph below.
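
One way to check this manually (assuming a Spark interpreter setting with
`master` set to `yarn` and `spark.submit.deployMode` set to `cluster`) is to
run a paragraph that touches ZeppelinContext:

```scala
// Hypothetical %spark paragraph for the manual check: `z` is the
// ZeppelinContext injected by the interpreter, `spark` the SparkSession.
val df = spark.range(0, 10).toDF("value")
// Before the fix this failed in yarn-cluster mode because ZeppelinContext
// could not be found on the driver's classpath; after it, the table renders.
z.show(df)
```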

### Screenshots (if appropriate)

### Questions:
* Do the license files need to be updated? No
* Are there breaking changes for older versions? No
* Does this need documentation? No

Author: Jeff Zhang <zjf...@apache.org>

Closes #3181 from zjffdu/ZEPPELIN-3633 and squashes the following commits:

e800037d9 [Jeff Zhang] ZEPPELIN-3633. ZeppelinContext Not Found in yarn-cluster Mode

(cherry picked from commit 92f244ef7e1902e51dbec6b759152341992d834c)
Signed-off-by: Jeff Zhang <zjf...@apache.org>


Project: http://git-wip-us.apache.org/repos/asf/zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/zeppelin/commit/4a873553
Tree: http://git-wip-us.apache.org/repos/asf/zeppelin/tree/4a873553
Diff: http://git-wip-us.apache.org/repos/asf/zeppelin/diff/4a873553

Branch: refs/heads/branch-0.8
Commit: 4a87355309b54325f70dc7b690e8aa04823f4245
Parents: 6fed062
Author: Jeff Zhang <zjf...@apache.org>
Authored: Wed Sep 19 11:21:50 2018 +0800
Committer: Jeff Zhang <zjf...@apache.org>
Committed: Wed Sep 26 08:53:30 2018 +0800

----------------------------------------------------------------------
 spark/interpreter/figure/unnamed-chunk-1-1.png    | Bin 0 -> 407541 bytes
 .../spark/BaseSparkScalaInterpreter.scala         |  10 +++++++++-
 2 files changed, 9 insertions(+), 1 deletion(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/zeppelin/blob/4a873553/spark/interpreter/figure/unnamed-chunk-1-1.png
----------------------------------------------------------------------
diff --git a/spark/interpreter/figure/unnamed-chunk-1-1.png b/spark/interpreter/figure/unnamed-chunk-1-1.png
new file mode 100644
index 0000000..6f03c95
Binary files /dev/null and b/spark/interpreter/figure/unnamed-chunk-1-1.png differ

http://git-wip-us.apache.org/repos/asf/zeppelin/blob/4a873553/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
----------------------------------------------------------------------
diff --git a/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala b/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
index 734a303..b26b834 100644
--- a/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
+++ b/spark/spark-scala-parent/src/main/scala/org/apache/zeppelin/spark/BaseSparkScalaInterpreter.scala
@@ -372,8 +372,16 @@ abstract class BaseSparkScalaInterpreter(val conf: SparkConf,
     val sparkJars = conf.getOption("spark.jars").map(_.split(","))
       .map(_.filter(_.nonEmpty)).toSeq.flatten
     val depJars = depFiles.asScala.filter(_.endsWith(".jar"))
-    val result = sparkJars ++ depJars
+    // add zeppelin spark interpreter jar
+    val zeppelinInterpreterJarURL = getClass.getProtectionDomain.getCodeSource.getLocation
+    // zeppelinInterpreterJarURL might be a folder when under unit testing
+    val result = if (new File(zeppelinInterpreterJarURL.getFile).isDirectory) {
+      sparkJars ++ depJars
+    } else {
+      sparkJars ++ depJars ++ Seq(zeppelinInterpreterJarURL.getFile)
+    }
     conf.set("spark.jars", result.mkString(","))
+    LOGGER.debug("User jar for spark repl: " + conf.get("spark.jars"))
     result
   }
 
