Change default interpreter to pyspark

2016-07-18 Thread Jayant Raj
Zeppelin assigns the Scala interpreter as the default for Spark notebooks. This default creates an additional step when you want to write code using PySpark: you have to insert %pyspark at the beginning of each paragraph of the notebook so that Zeppelin treats it as PySpark code. Is there a way to change the default interpreter to pyspark?
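[For illustration, a minimal sketch of the extra step being described: every paragraph that should run under PySpark has to start with the %pyspark directive, otherwise it is sent to the default Scala interpreter. The file path below is hypothetical; sc is the SparkContext that Zeppelin's PySpark interpreter injects into the paragraph.]

    %pyspark
    # Without the %pyspark line above, this paragraph would go to the
    # default Scala interpreter and fail as invalid Scala.
    rdd = sc.textFile("/tmp/example.txt")  # hypothetical input path
    print(rdd.count())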

Re: Change default interpreter to pyspark

2016-07-18 Thread Ahyoung Ryu
Hi Jayant, I tested as you described and I can't change the default interpreter either, so I created an issue for this: ZEPPELIN-1209. Thanks for reporting it. Best regards, Ahyoung