After the 0.6.2 -> 0.7 upgrade, PySpark is no longer the default Spark interpreter,
even though we have org.apache.zeppelin.spark.*PySparkInterpreter*
listed first in zeppelin.interpreters.

Here is zeppelin.interpreters in our zeppelin-site.xml:

<property>
  <name>zeppelin.interpreters</name>
  <value>org.apache.zeppelin.spark.PySparkInterpreter,org.apache.zeppelin.spark.SparkInterpreter,...</value>
</property>
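
For now we work around it by selecting the interpreter per paragraph; a minimal
sketch, assuming the spark interpreter group is bound to the note under its
standard name:

%spark.pyspark
# explicitly targets the PySpark interpreter instead of the group default
print(sc.version)

But we would prefer PySpark to be the group default again, as it was before the
upgrade.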



Any ideas how to fix this?


Thanks,
Ruslan
