Author: moon
Date: Thu Nov 12 15:06:13 2015
New Revision: 1714064
URL: http://svn.apache.org/viewvc?rev=1714064&view=rev
Log:
[ZEPPELIN-407] Improve document on how to manage external libraries in spark
interpreter
Modified:
incubator/zeppelin/site/docs/install/install.html
incubator/zeppelin/site/docs/interpreter/spark.html
Modified: incubator/zeppelin/site/docs/install/install.html
URL:
http://svn.apache.org/viewvc/incubator/zeppelin/site/docs/install/install.html?rev=1714064&r1=1714063&r2=1714064&view=diff
==============================================================================
--- incubator/zeppelin/site/docs/install/install.html (original)
+++ incubator/zeppelin/site/docs/install/install.html Thu Nov 12 15:06:13 2015
@@ -25,7 +25,11 @@
<link rel="apple-touch-icon" href="images/apple-touch-icon.png">
<link rel="apple-touch-icon" sizes="72x72"
href="images/apple-touch-icon-72x72.png">
<link rel="apple-touch-icon" sizes="114x114"
href="images/apple-touch-icon-114x114.png">
- -->
+ -->
+
+ <!-- Js -->
+ <script src="https://code.jquery.com/jquery-1.10.2.min.js"></script>
+ <script
src="/assets/themes/zeppelin/bootstrap/js/bootstrap.min.js"></script>
<!-- atom & rss feed -->
<link href="/atom.xml" type="application/atom+xml" rel="alternate"
title="Sitewide ATOM Feed">
@@ -160,6 +164,30 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
@@ -265,6 +293,30 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
<li><a href="/download.html">Download</a></li>
@@ -396,14 +448,6 @@
<td>JVM Options</td>
</table>
-<h4>Add jars, files</h4>
-
-<p>spark.jars, spark.files property in <em>ZEPPELIN_JAVA_OPTS</em> adds jars,
files into SparkContext.
-for example, </p>
-<div class="highlight"><pre><code class="text language-text"
data-lang="text">ZEPPELIN_JAVA_OPTS="-Dspark.jars=/mylib1.jar,/mylib2.jar
-Dspark.files=/myfile1.dat,/myfile2.dat"
-</code></pre></div>
-<p>or you can do it dynamically with <a
href="../interpreter/spark.html#dependencyloading">dependency loader</a></p>
-
<h2>Start/Stop</h2>
<h4>Start Zeppelin</h4>
@@ -442,9 +486,6 @@ Note that port <strong>8081</strong> als
- <script src="https://code.jquery.com/jquery-1.10.2.min.js"></script>
-
- <script
src="/assets/themes/zeppelin/bootstrap/js/bootstrap.min.js"></script>
</body>
</html>
Modified: incubator/zeppelin/site/docs/interpreter/spark.html
URL:
http://svn.apache.org/viewvc/incubator/zeppelin/site/docs/interpreter/spark.html?rev=1714064&r1=1714063&r2=1714064&view=diff
==============================================================================
--- incubator/zeppelin/site/docs/interpreter/spark.html (original)
+++ incubator/zeppelin/site/docs/interpreter/spark.html Thu Nov 12 15:06:13 2015
@@ -25,7 +25,11 @@
<link rel="apple-touch-icon" href="images/apple-touch-icon.png">
<link rel="apple-touch-icon" sizes="72x72"
href="images/apple-touch-icon-72x72.png">
<link rel="apple-touch-icon" sizes="114x114"
href="images/apple-touch-icon-114x114.png">
- -->
+ -->
+
+ <!-- Js -->
+ <script src="https://code.jquery.com/jquery-1.10.2.min.js"></script>
+ <script
src="/assets/themes/zeppelin/bootstrap/js/bootstrap.min.js"></script>
<!-- atom & rss feed -->
<link href="/atom.xml" type="application/atom+xml" rel="alternate"
title="Sitewide ATOM Feed">
@@ -168,6 +172,22 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
@@ -281,6 +301,22 @@
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
+
<li><a href="/download.html">Download</a></li>
@@ -378,7 +414,11 @@ Spark Interpreter group, which consisted
<br />
<br /></p>
-<h3>Dependency loading</h3>
+<h3>Dependency Management</h3>
+
+<p>There are two ways to load external libraries into the Spark interpreter. The first is
using Zeppelin's %dep interpreter and the second is loading Spark
properties.</p>
+
+<h4>1. Dynamic Dependency Loading via %dep interpreter</h4>
<p>When your code requires an external library, instead of downloading,
copying, and restarting Zeppelin, you can easily do the following jobs using the %dep
interpreter.</p>
@@ -389,7 +429,8 @@ Spark Interpreter group, which consisted
<li>Automatically add libraries to the Spark cluster (you can turn this off)</li>
</ul>
-<p>Dep interpreter leverages scala environment. So you can write any Scala
code here.</p>
+<p>The Dep interpreter leverages the Scala environment, so you can write any Scala
code here.
+Note that the %dep interpreter should be used before %spark, %pyspark, and %sql.</p>
<p>Here's usages.</p>
<div class="highlight"><pre><code class="scala language-scala"
data-lang="scala"><span class="o">%</span><span class="n">dep</span>
@@ -401,6 +442,9 @@ Spark Interpreter group, which consisted
<span class="c1">// add maven snapshot repository</span>
<span class="n">z</span><span class="o">.</span><span
class="n">addRepo</span><span class="o">(</span><span
class="s">"RepoName"</span><span class="o">).</span><span
class="n">url</span><span class="o">(</span><span
class="s">"RepoURL"</span><span class="o">).</span><span
class="n">snapshot</span><span class="o">()</span>
+<span class="c1">// add credentials for private maven repository</span>
+<span class="n">z</span><span class="o">.</span><span
class="n">addRepo</span><span class="o">(</span><span
class="s">"RepoName"</span><span class="o">).</span><span
class="n">url</span><span class="o">(</span><span
class="s">"RepoURL"</span><span class="o">).</span><span
class="n">username</span><span class="o">(</span><span
class="s">"username"</span><span class="o">).</span><span
class="n">password</span><span class="o">(</span><span
class="s">"password"</span><span class="o">)</span>
+
<span class="c1">// add artifact from filesystem</span>
<span class="n">z</span><span class="o">.</span><span
class="n">load</span><span class="o">(</span><span
class="s">"/path/to.jar"</span><span class="o">)</span>
@@ -421,7 +465,65 @@ Spark Interpreter group, which consisted
<span class="c1">// local() skips adding artifact to spark clusters (skipping
sc.addJar())</span>
<span class="n">z</span><span class="o">.</span><span
class="n">load</span><span class="o">(</span><span
class="s">"groupId:artifactId:version"</span><span
class="o">).</span><span class="n">local</span><span class="o">()</span>
</code></pre></div>
-<p>Note that %dep interpreter should be used before %spark, %pyspark, %sql.</p>
+<p><br /></p>
+
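+<p>For example, a rough sketch of a %dep paragraph followed by a %spark
paragraph that uses the loaded artifact could look like the following (the
spark-csv coordinate is only an assumed example, borrowed from the section
below; the import assumes that library's usual package name):</p>
+
+<div class="highlight"><pre><code class="scala language-scala" data-lang="scala">%dep
+// fetched from maven; also added to the spark cluster unless .local() is used
+z.load("com.databricks:spark-csv_2.10:1.2.0")
+
+%spark
+// the artifact loaded above is now on the interpreter classpath
+import com.databricks.spark.csv._
+</code></pre></div>
+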
+<h4>2. Loading Spark Properties</h4>
+
+<p>Once <code>SPARK_HOME</code> is set in <code>conf/zeppelin-env.sh</code>,
Zeppelin uses <code>spark-submit</code> as the Spark interpreter runner.
<code>spark-submit</code> supports two ways to load configurations. The first
is command line options such as --master, and Zeppelin can pass these options to
<code>spark-submit</code> by exporting <code>SPARK_SUBMIT_OPTIONS</code> in
<code>conf/zeppelin-env.sh</code>. The second is reading configuration options from
<code>SPARK_HOME/conf/spark-defaults.conf</code>. The Spark properties that a user
can set to distribute libraries are:</p>
+
+<table class="table-configuration">
+ <tr>
+ <th>spark-defaults.conf</th>
+ <th>SPARK_SUBMIT_OPTIONS</th>
+ <th>Applicable Interpreter</th>
+ <th>Description</th>
+ </tr>
+ <tr>
+ <td>spark.jars</td>
+ <td>--jars</td>
+ <td>%spark</td>
+ <td>Comma-separated list of local jars to include on the driver and
executor classpaths.</td>
+ </tr>
+ <tr>
+ <td>spark.jars.packages</td>
+ <td>--packages</td>
+ <td>%spark</td>
+ <td>Comma-separated list of maven coordinates of jars to include on the
driver and executor classpaths. Will search the local maven repo, then maven
central and any additional remote repositories given by --repositories. The
format for the coordinates should be groupId:artifactId:version.</td>
+ </tr>
+ <tr>
+ <td>spark.files</td>
+ <td>--files</td>
+ <td>%pyspark</td>
+ <td>Comma-separated list of files to be placed in the working directory of
each executor.</td>
+ </tr>
+</table>
+
+<p>Note that adding jars to %pyspark is only available via the %dep interpreter at
the moment.</p>
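+
+<p>For that case, a hypothetical sketch of loading a jar through the %dep
interpreter so that it is available to %pyspark (the jar path is made up):</p>
+
+<div class="highlight"><pre><code class="scala language-scala" data-lang="scala">%dep
+// hypothetical path; without .local() the jar is also sent to the cluster via sc.addJar()
+z.load("/path/to/mylib.jar")
+</code></pre></div>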
+
+<p><br/>
+Here are a few examples:</p>
+
+<h5>0.5.5 and later</h5>
+
+<ul>
+<li><p>SPARK_SUBMIT_OPTIONS in conf/zeppelin-env.sh</p>
+<div class="highlight"><pre><code class="text language-text"
data-lang="text">export SPARK_SUBMIT_OPTIONS="--packages
com.databricks:spark-csv_2.10:1.2.0 --jars /path/mylib1.jar,/path/mylib2.jar
--files /path/mylib1.py,/path/mylib2.zip,/path/mylib3.egg"
+</code></pre></div></li>
+<li><p>SPARK_HOME/conf/spark-defaults.conf</p>
+<div class="highlight"><pre><code class="text language-text"
data-lang="text">spark.jars /path/mylib1.jar,/path/mylib2.jar
+spark.jars.packages com.databricks:spark-csv_2.10:1.2.0
+spark.files /path/mylib1.py,/path/mylib2.egg,/path/mylib3.zip
+</code></pre></div></li>
+</ul>
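+
+<p>As a quick, hypothetical check that a package pulled in through
--packages is actually visible to the interpreter, a %spark paragraph along
these lines could be run (the input path is made up; the reader API shown is
the usual spark-csv one):</p>
+
+<div class="highlight"><pre><code class="scala language-scala" data-lang="scala">%spark
+// com.databricks:spark-csv_2.10:1.2.0 was loaded via --packages above
+val df = sqlContext.read
+  .format("com.databricks.spark.csv")
+  .option("header", "true")
+  .load("/path/to/some.csv")   // hypothetical input file
+df.printSchema()
+</code></pre></div>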
+
+<h5>0.5.0</h5>
+
+<ul>
+<li><p>ZEPPELIN_JAVA_OPTS in conf/zeppelin-env.sh</p>
+<div class="highlight"><pre><code class="text language-text"
data-lang="text">export
ZEPPELIN_JAVA_OPTS="-Dspark.jars=/path/mylib1.jar,/path/mylib2.jar
-Dspark.files=/path/myfile1.dat,/path/myfile2.dat"
+</code></pre></div>
+<p><br /></p></li>
+</ul>
<p><a name="zeppelincontext"> </a>
<br />
@@ -501,9 +603,6 @@ select * from ${table=defaultTableName}
- <script src="https://code.jquery.com/jquery-1.10.2.min.js"></script>
-
- <script
src="/assets/themes/zeppelin/bootstrap/js/bootstrap.min.js"></script>
</body>
</html>