Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-12-08 Thread Ruslan Dautkhanov
I got a lucky jira number :-) https://issues.apache.org/jira/browse/ZEPPELIN-1777

Thank you Jeff.

-- Ruslan Dautkhanov

On Thu, Dec 8, 2016 at 10:50 PM, Jeff Zhang wrote:
> hmm, I think so, please file a ticket for it.
>
> Ruslan Dautkhanov wrote on Fri, Dec 9, 2016 at 1:49 PM:
>> Hi Jeff,
>>
>> Wh

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-12-08 Thread Jeff Zhang
hmm, I think so, please file a ticket for it.

Ruslan Dautkhanov wrote on Fri, Dec 9, 2016 at 1:49 PM:
> Hi Jeff,
>
> When I made pySpark the default, it works as expected, except the Settings UI. See screenshot below.
>
> Notice it shows %spark twice.
> The first time as default; the 2nd one is not.
> It should have be

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-12-08 Thread Ruslan Dautkhanov
Hi Jeff,

When I made pySpark the default, it works as expected, except the Settings UI. See screenshot below.

Notice it shows %spark twice. The first time as default; the 2nd one is not. It should have been %pyspark (default), %spark, .. since I made pyspark the default.

Is this a new bug in 0.7?

[image: Inline im

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-30 Thread Ruslan Dautkhanov
Jeff,

Yep, that was it. Thank you!

-- Ruslan Dautkhanov

On Wed, Nov 30, 2016 at 7:34 PM, Jeff Zhang wrote:
> Hi Ruslan,
>
> I missed another thing: you also need to delete the file conf/interpreter.json,
> which stores the original setting. Otherwise the original setting is always
> loaded.

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-30 Thread Jeff Zhang
Hi Ruslan,

I missed another thing: you also need to delete the file conf/interpreter.json, which stores the original setting. Otherwise the original setting is always loaded.

Ruslan Dautkhanov wrote on Thu, Dec 1, 2016 at 1:03 AM:
> Got it. Thanks Jeff.
>
> I've downloaded
>
> https://github.com/apache/zeppelin/blob
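The file shuffle described in this message and the ones below can be sketched as shell steps. This is a sandbox demonstration only: the directory layout and placeholder file contents are stand-ins, and on a real install ZEPPELIN_HOME would point at your Zeppelin 0.7 directory.

```shell
# Sandbox demo of the moves discussed in the thread; mktemp stands in for a
# real Zeppelin install so nothing outside a temp dir is touched.
ZEPPELIN_HOME=$(mktemp -d)
mkdir -p "$ZEPPELIN_HOME/interpreter/spark" "$ZEPPELIN_HOME/conf"

# Placeholder stand-ins for the real files:
echo '{"demo": "edited interpreter-setting.json"}' > "$ZEPPELIN_HOME/interpreter-setting.json"
echo '{"demo": "stale cached settings"}' > "$ZEPPELIN_HOME/conf/interpreter.json"

# 1. The edited interpreter-setting.json goes into interpreter/spark/ (not conf/).
cp "$ZEPPELIN_HOME/interpreter-setting.json" "$ZEPPELIN_HOME/interpreter/spark/"

# 2. Remove the cached conf/interpreter.json so the new default is actually
#    loaded; a rename keeps a backup instead of deleting outright.
mv "$ZEPPELIN_HOME/conf/interpreter.json" "$ZEPPELIN_HOME/conf/interpreter.json.bak"
```

On a real install you would restart Zeppelin afterwards so it regenerates conf/interpreter.json from the new settings.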

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-30 Thread Ruslan Dautkhanov
Got it. Thanks Jeff.

I've downloaded
https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json
and saved it to $ZEPPELIN_HOME/interpreter/spark/

Then moved "defaultInterpreter": true, from the json section "className": "org.apache.zeppelin.spark.SparkInterpret

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-29 Thread Jeff Zhang
No, you don't need to create that directory, it should be in $ZEPPELIN_HOME/interpreter/spark

Ruslan Dautkhanov wrote on Wed, Nov 30, 2016 at 12:12 PM:
> Thank you Jeff.
>
> Do I have to create the interpreter/spark directory in $ZEPPELIN_HOME/conf
> or in the $ZEPPELIN_HOME directory?
> So zeppelin.interpreters in

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-29 Thread Ruslan Dautkhanov
Thank you Jeff.

Do I have to create the interpreter/spark directory in $ZEPPELIN_HOME/conf or in the $ZEPPELIN_HOME directory?
So zeppelin.interpreters in zeppelin-site.xml is deprecated in 0.7?

Thanks!

-- Ruslan Dautkhanov

On Tue, Nov 29, 2016 at 6:54 PM, Jeff Zhang wrote:
> The default interpret

Re: 0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-29 Thread Jeff Zhang
The default interpreter is now defined in interpreter-setting.json. You can update the following file to make pyspark the default interpreter, and then copy it to the folder interpreter/spark:
https://github.com/apache/zeppelin/blob/master/spark/src/main/resources/interpreter-setting.json

Ruslan D
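The edit Jeff describes — moving the "defaultInterpreter": true flag from the Scala Spark entry to the PySpark entry — can be sketched programmatically. This is a minimal sketch, assuming interpreter-setting.json contains a JSON list of entries keyed by "className" (the class names below come from the thread; the exact file layout in your Zeppelin version may differ):

```python
import json

def make_pyspark_default(settings):
    """Move the defaultInterpreter flag from SparkInterpreter to PySparkInterpreter.

    `settings` is assumed to be a list of interpreter entries, each a dict
    with a "className" key, mirroring interpreter-setting.json.
    """
    for entry in settings:
        cls = entry.get("className", "")
        if cls == "org.apache.zeppelin.spark.SparkInterpreter":
            # Drop the flag from the Scala interpreter entry, if present.
            entry.pop("defaultInterpreter", None)
        elif cls == "org.apache.zeppelin.spark.PySparkInterpreter":
            # Mark PySpark as the default.
            entry["defaultInterpreter"] = True
    return settings

# Tiny stand-in for the real file's contents:
settings = [
    {"className": "org.apache.zeppelin.spark.SparkInterpreter",
     "defaultInterpreter": True},
    {"className": "org.apache.zeppelin.spark.PySparkInterpreter"},
]
print(json.dumps(make_pyspark_default(settings), indent=2))
```

In practice you would `json.load` the real interpreter-setting.json, apply the move, and write the result to $ZEPPELIN_HOME/interpreter/spark/ before restarting Zeppelin.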

0.7.0 zeppelin.interpreters change: can't make pyspark be default Spark interpreter

2016-11-29 Thread Ruslan Dautkhanov
After a 0.6.2 -> 0.7 upgrade, pySpark isn't the default Spark interpreter, even though we have org.apache.zeppelin.spark.*PySparkInterpreter* listed first in zeppelin.interpreters.

zeppelin.interpreters in zeppelin-site.xml:
> zeppelin.interpreters
>
> org.apache.zeppelin.spark.PySparkInterpreter,org.
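For reference, the pre-0.7 mechanism the original post relies on was a Hadoop-style property in zeppelin-site.xml, where the first class listed was the default. A sketch of that (deprecated in 0.7) property, reconstructed from the truncated snippet above — the remaining interpreter class names are elided:

```xml
<!-- Deprecated in Zeppelin 0.7: interpreter ordering here no longer
     controls the default; interpreter-setting.json does. -->
<property>
  <name>zeppelin.interpreters</name>
  <!-- first class in the list was the default; rest of the list elided -->
  <value>org.apache.zeppelin.spark.PySparkInterpreter,...</value>
</property>
```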