Add pyspark related environment variables

Project: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/repo
Commit: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/commit/71571f00
Tree: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/tree/71571f00
Diff: http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/diff/71571f00

Branch: refs/heads/master
Commit: 71571f006c532104ff45b24e9b6081599b83ca51
Parents: 07ac4fb
Author: Lee moon soo <[email protected]>
Authored: Fri Mar 13 21:57:17 2015 +0900
Committer: Lee moon soo <[email protected]>
Committed: Fri Mar 13 21:57:17 2015 +0900

----------------------------------------------------------------------
 conf/zeppelin-env.sh.template | 5 +++++
 1 file changed, 5 insertions(+)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/incubator-zeppelin/blob/71571f00/conf/zeppelin-env.sh.template
----------------------------------------------------------------------
diff --git a/conf/zeppelin-env.sh.template b/conf/zeppelin-env.sh.template
index 704ffe5..48150df 100644
--- a/conf/zeppelin-env.sh.template
+++ b/conf/zeppelin-env.sh.template
@@ -15,3 +15,8 @@
 # Options read in YARN client mode
 # export SPARK_YARN_JAR          # Yarn executor needs spark-assembly-*.jar for running tasks in a yarn cluster.
 # export HADOOP_CONF_DIR         # yarn-site.xml is located in configuration directory in HADOOP_CONF_DIR.
+
+# Pyspark
+# To configure pyspark, you need to set spark distribution's path to 'spark.home' property in Interpreter setting screen in Zeppelin GUI
+# export PYSPARK_PYTHON          # path to the python command. must be the same path on the driver(Zeppelin) and all workers.
+# export PYTHONPATH              # extra PYTHONPATH.
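
After uncommenting these lines in conf/zeppelin-env.sh, they might be filled in as below. The paths are hypothetical examples, not part of the commit; adjust them for your installation:

```shell
# Example values only (not from the commit) -- substitute your own paths.
# Must resolve to the same Python on the Zeppelin driver and on every Spark worker.
export PYSPARK_PYTHON=/usr/bin/python
# Prepend any extra library directories pyspark should see.
export PYTHONPATH="/opt/extra-python-libs:${PYTHONPATH}"
```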
