One more thing: I only deployed Spark on the Zeppelin host; Spark is not
deployed on the YARN cluster nodes.
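
Since pyspark only exists on the Zeppelin host, one fix I plan to try is shipping
the pyspark and py4j archives to the YARN executors with the job itself, so the
remote python workers can import pyspark without a local Spark install. A minimal
sketch for zeppelin-env.sh, assuming the py4j version matches the Spark build
(0.8.2.1 here) and keeping whatever --jars option is already passed:

export SPARK_HOME=/usr/spark/current
# --py-files ships these archives to the executors; the python workers on the
# YARN nodes add them to their module search path before importing pyspark.
export SPARK_SUBMIT_OPTIONS="--py-files $SPARK_HOME/python/lib/pyspark.zip,$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip"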

> On Dec 8, 2015, at 7:50 PM, Fengdong Yu <fengdo...@everstring.com> wrote:
> 
> System.getenv().get("PYTHONPATH")
> res6: String = 
> /usr/spark/current/python/lib/py4j-0.8.2.1-src.zip:/usr/spark/current/python/::/usr/local/lib/python2.7/site-packages/:/usr/local/lib/python2.7/site-packages/
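> 
> As a cross-check from the python side, I can also print what the %pyspark
> interpreter actually searches; sys.path may differ from the PYTHONPATH above.
> A quick sketch:
> 
> %pyspark
> import sys
> for p in sys.path:
>     print(p)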
> 
> 
> 1) pyspark does exist under $SPARK_HOME
> 
> 2) my zeppelin-env:
> export SPARK_HOME=/usr/spark/current
> export HADOOP_CONF_DIR=/etc/hadoop/conf
> 
> export ZEPPELIN_PORT=10008 
> export SPARK_SUBMIT_OPTIONS="--jars xxxxxx"
> 
> 3) Can you try running the following? (I can also run %pyspark successfully as
> long as I don’t import modules from pyspark.)
> 
> %pyspark
> 
> from pyspark.sql.types import BooleanType
> sqlContext.udf.register("is_empty", lambda x: True if not x else False, BooleanType())
> sqlContext.sql("select is_empty(name) as name from test_table limit 10").show()
> 
> 
> 
>> On Dec 8, 2015, at 7:43 PM, moon soo Lee <m...@apache.org> wrote:
>> 
>> Tried with the 0.5.5-incubating release after adding SPARK_1_5_2 in
>> spark/src/main/java/org/apache/zeppelin/spark/SparkVersion.java.
>> 
>> My conf/zeppelin-env.sh has only SPARK_HOME, pointing to a Spark 1.5.2
>> distribution, and I was able to run %pyspark without any problem.
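>> 
>> i.e., with just something like this (the path here is only an example):
>> 
>> export SPARK_HOME=/path/to/spark-1.5.2-bin-hadoop2.6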
>> 
>> When you run
>> 
>> System.getenv("PYTHONPATH")
>> 
>> in the notebook, what do you see? Can you check that those files and dirs
>> exist?
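>> 
>> For example, something like this in a note should do it (just a sketch; it
>> splits the PYTHONPATH the process sees and tests each entry on the Zeppelin
>> host):
>> 
>> %pyspark
>> import os
>> for p in os.environ.get("PYTHONPATH", "").split(":"):
>>     print(p, os.path.exists(p))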
>> 
>> Thanks,
>> moon
>> 
>> On Tue, Dec 8, 2015 at 6:22 PM Fengdong Yu <fengdo...@everstring.com> wrote:
>> I tried; I still get the same error.
>> 
>> I even tried removing spark.yarn.jar from interpreter.json, and it still fails
>> with the same error.
>> 
>> 
>> 
>>> On Dec 8, 2015, at 5:07 PM, moon soo Lee <leemoon...@gmail.com> wrote:
>>> 
>>> Can you try setting only SPARK_HOME, without PYTHONPATH?
>>> 
>>> Thanks,
>>> moon
>>> 
>>> 
>>> On Tue, Dec 8, 2015 at 6:04 PM Amjad ALSHABANI <ashshab...@gmail.com> wrote:
>>> Hello,
>>> 
>>> Are you sure that you've installed the pyspark module?
>>> 
>>> Please check your Spark installation directory to see whether it contains
>>> the python sub-directory.
>>> 
>>> Amjad
>>> 
>>> On Dec 8, 2015 9:55 AM, "Fengdong Yu" <fengdo...@everstring.com> wrote:
>>> Hi
>>> 
>>> I am using Zeppelin-0.5.5 with Spark 1.5.2
>>> 
>>> It cannot find the pyspark module.
>>> 
>>> 
>>> Error from python worker:
>>>  /usr/local/bin/python: No module named pyspark
>>> PYTHONPATH was:
>>> 
>>> 
>>> 
>>> I’ve configured pyspark in zeppelin-env.sh:
>>> 
>>> export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip
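>>> 
>>> One thing I should double-check: SPARK_HOME has to be exported before this
>>> line in the file, otherwise $SPARK_HOME expands to an empty string. The
>>> ordering I have in mind (sketch):
>>> 
>>> export SPARK_HOME=/usr/spark/current
>>> export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip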
>>> 
>>> 
>>> Is there anything else I've skipped? Thanks
>>> 
>>> 
>>> 
>> 
> 
