[ 
https://issues.apache.org/jira/browse/SPARK-13973?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15249705#comment-15249705
 ] 

Paul Shearer edited comment on SPARK-13973 at 4/20/16 12:09 PM:
----------------------------------------------------------------

Bottom line... I think IPYTHON=1 should either 

(1) mean what it appears to mean - IPython and not necessarily the notebook - 
or 
(2) be removed entirely as too confusing.

As the code now stands, an IPython shell user's pyspark is silently broken if 
they still have a deprecated option set from an older version of pyspark. 
Fixable, yes, but more confusing than it needs to be, and contrary to the 
intent of backwards compatibility.
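
For reference, the non-deprecated equivalents use PYSPARK_DRIVER_PYTHON / 
PYSPARK_DRIVER_PYTHON_OPTS; roughly like this (the option strings here are 
only an example):

{code:none}
# plain IPython shell as the pyspark driver
export PYSPARK_DRIVER_PYTHON=ipython

# or: Jupyter notebook as the driver (illustrative options)
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --no-browser"
{code}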


was (Author: pshearer):
Bottom line... I think IPYTHON=1 should either 

(1) mean what it appears to mean - IPython and not necessarily the notebook - 
or 
(2) be removed entirely as too confusing.

As the code now stands, the IPython shell user's pyspark is silently broken if 
they happen to have set a deprecated option when using an older version of 
pyspark. Not exactly the definition of backwards compatibility.

> `ipython notebook` is going away...
> -----------------------------------
>
>                 Key: SPARK-13973
>                 URL: https://issues.apache.org/jira/browse/SPARK-13973
>             Project: Spark
>          Issue Type: Improvement
>          Components: PySpark
>         Environment: spark-1.6.1-bin-hadoop2.6
> Anaconda2-2.5.0-Linux-x86_64
>            Reporter: Bogdan Pirvu
>            Assignee: Rekha Joshi
>            Priority: Trivial
>             Fix For: 2.0.0
>
>
> Starting {{pyspark}} with following environment variables:
> {code:none}
> export IPYTHON=1
> export IPYTHON_OPTS="notebook --no-browser"
> {code}
> yields this warning
> {code:none}
> [TerminalIPythonApp] WARNING | Subcommand `ipython notebook` is deprecated and will be removed in future versions.
> [TerminalIPythonApp] WARNING | You likely want to use `jupyter notebook`... continue in 5 sec. Press Ctrl-C to quit now.
> {code}
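> For clarity, the warning comes from IPython itself (TerminalIPythonApp), not 
> from Spark, so it can be reproduced outside pyspark; for example:
> {code:none}
> # deprecated subcommand - prints the same TerminalIPythonApp warning on IPython 4.x
> ipython notebook --no-browser
> # non-deprecated equivalent
> jupyter notebook --no-browser
> {code}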
> Changing line 52 from
> {code:none}
> PYSPARK_DRIVER_PYTHON="ipython"
> {code}
> to
> {code:none}
> PYSPARK_DRIVER_PYTHON="jupyter"
> {code}
> in https://github.com/apache/spark/blob/master/bin/pyspark solves this 
> issue for me, but I'm not sure whether it's a sustainable fix, as I'm not 
> familiar with the rest of the code...
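> Roughly, the block that line sits in maps the deprecated variables onto the 
> newer driver settings; this is a paraphrase of bin/pyspark on the 1.6.x 
> branch, not the exact script:
> {code:none}
> # paraphrased sketch of the driver selection in bin/pyspark (approximate, not verbatim)
> if [[ -n "$IPYTHON_OPTS" || "$IPYTHON" == "1" ]]; then
>   # backwards compatibility: deprecated IPYTHON/IPYTHON_OPTS force the ipython driver
>   PYSPARK_DRIVER_PYTHON="ipython"
>   PYSPARK_DRIVER_PYTHON_OPTS="$PYSPARK_DRIVER_PYTHON_OPTS $IPYTHON_OPTS"
> elif [[ -z "$PYSPARK_DRIVER_PYTHON" ]]; then
>   PYSPARK_DRIVER_PYTHON="python"
> fi
> {code}
> Since IPYTHON_OPTS feeds straight into PYSPARK_DRIVER_PYTHON_OPTS, swapping 
> the hard-coded "ipython" for "jupyter" makes the deprecated notebook setup 
> launch `jupyter notebook` instead.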
> This is the relevant part of my Python environment:
> {code:none}
> ipython                   4.1.2                    py27_0  
> ipython-genutils          0.1.0                     <pip>
> ipython_genutils          0.1.0                    py27_0  
> ipywidgets                4.1.1                    py27_0  
> ...
> jupyter                   1.0.0                    py27_1  
> jupyter-client            4.2.1                     <pip>
> jupyter-console           4.1.1                     <pip>
> jupyter-core              4.1.0                     <pip>
> jupyter_client            4.2.1                    py27_0  
> jupyter_console           4.1.1                    py27_0  
> jupyter_core              4.1.0                    py27_0
> {code}



