My notebook:

%pyspark
from pyspark.sql.types import BooleanType

# Register a UDF that returns True when the column value is null or empty
sqlContext.udf.register("is_empty", lambda x: not x, BooleanType())
sqlContext.sql("select is_empty(job_functions), job_functions from test_table limit 10").show()


It can import modules, but it throws exceptions after the job is submitted.
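
Since the "No module named pyspark" error below comes from a python worker, the executors' environment is the thing to check, not just the driver's. A minimal diagnostic sketch, assuming sc is the SparkContext Zeppelin provides (the probe helper is only for illustration):

%pyspark
import os
import sys

# Driver side: this interpreter imports pyspark fine, so print its
# environment only for comparison with the workers
print(sys.executable)
print(os.environ.get("PYTHONPATH"))

def probe(_):
    # Runs on an executor: report its Python binary and PYTHONPATH
    import os
    import sys
    yield (sys.executable, os.environ.get("PYTHONPATH"))

# One small partition is enough to sample a single worker's environment
print(sc.parallelize([0], 1).mapPartitions(probe).collect())

In the broken state the collect() will fail with the same worker error, which at least confirms the executors are the problem; once the path reaches the workers, it should print their interpreter and PYTHONPATH.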




> On Dec 8, 2015, at 5:04 PM, Amjad ALSHABANI <ashshab...@gmail.com> wrote:
> 
> Hello,
> 
> Are you sure that you've installed the pyspark module?
> 
> Please check your Spark installation directory to see whether it contains the python
> sub-directory.
> 
> Amjad
> 
> On Dec 8, 2015 9:55 AM, "Fengdong Yu" <fengdo...@everstring.com> wrote:
> Hi
> 
> I am using Zeppelin-0.5.5 with Spark 1.5.2
> 
> It cannot find the pyspark module.
> 
> 
> Error from python worker:
>  /usr/local/bin/python: No module named pyspark
> PYTHONPATH was:
> 
> 
> 
> I’ve configured pyspark in zeppelin-env.sh:
> 
> export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip
> 
> 
> Any other settings I skipped? Thanks
> 
> 
> 
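
A note on the quoted config: PYTHONPATH exported in zeppelin-env.sh reaches the driver-side interpreter, while the empty "PYTHONPATH was:" line in the worker error suggests the executors never receive it. A sketch of a fuller zeppelin-env.sh, assuming a local-mode Spark 1.5.2 install (the SPARK_HOME path is an assumption, not taken from the thread):

# zeppelin-env.sh -- a sketch, not a confirmed fix
export SPARK_HOME=/opt/spark-1.5.2              # assumed install location

# Driver-side Python path; the py4j version must match the zip shipped
# under $SPARK_HOME/python/lib of this Spark distribution
export PYTHONPATH=$SPARK_HOME/python:$SPARK_HOME/python/lib/py4j-0.8.2.1-src.zip:$SPARK_HOME/python/lib/pyspark.zip

# Pin the workers to the interpreter the error message shows
export PYSPARK_PYTHON=/usr/local/bin/python

For a non-local master, setting spark.executorEnv.PYTHONPATH in the Spark interpreter properties is one way to pass the same path through to the executors.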
