I'm afraid there's currently no way to make IPython the default for
%pyspark, but you can use %ipyspark to run IPython without this warning
message.
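
For example, a minimal %ipyspark paragraph (just a sketch, assuming the
ipython and jupyter packages are available to the Python that the
interpreter uses):

    %ipyspark
    # Runs in the IPython-backed PySpark interpreter; sc is the SparkContext
    # that Zeppelin injects into (i)pyspark paragraphs.
    print(sc.version)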

Making IPython the default is on my plan, though. For now I try to keep
backward compatibility as much as possible, so IPython is only used when it
is available; otherwise the old Python interpreter implementation is still
used. I will make IPython the default, with the original Python
implementation as the fallback, once the IPython interpreter becomes more
mature.




Ruslan Dautkhanov <dautkha...@gmail.com> wrote on Monday, December 11, 2017 at 1:20 PM:

> I'm getting an "IPython is available, use IPython for PySparkInterpreter"
> warning after starting the pyspark interpreter.
>
> How do I default %pyspark to ipython?
>
> Tried changing
> "class": "org.apache.zeppelin.spark.PySparkInterpreter",
> to
> "class": "org.apache.zeppelin.spark.IPySparkInterpreter",
> in interpreter.json, but this gets overwritten back to PySparkInterpreter.
>
> Also tried changing zeppelin.pyspark.python to ipython, with no luck.
>
> Is there a documented way to default the pyspark interpreter to ipython?
> Glanced over PR-2474 but can't quickly see what I am missing.
>
>
> Thanks.
>
>
>
