[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14150695#comment-14150695 ]

Andrew Davidson commented on SPARK-922:
---------------------------------------

I must have missed something. I am running the IPython notebook over an SSH tunnel, 
and it is still using the old Python version. I made sure to

export PYSPARK_PYTHON=python2.7

I also tried
export PYSPARK_PYTHON=/usr/bin/python2.7

import IPython
print IPython.sys_info()
{'commit_hash': '858d539',
 'commit_source': 'installation',
 'default_encoding': 'UTF-8',
 'ipython_path': 
'/usr/lib/python2.6/site-packages/ipython-0.13.2-py2.6.egg/IPython',
 'ipython_version': '0.13.2',
 'os_name': 'posix',
 'platform': 'Linux-3.4.37-40.44.amzn1.x86_64-x86_64-with-glibc2.2.5',
 'sys_executable': '/usr/bin/python2.6',
 'sys_platform': 'linux2',
 'sys_version': '2.6.9 (unknown, Sep 13 2014, 00:25:11) \n[GCC 4.8.2 20140120 
(Red Hat 4.8.2-16)]'}
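
The sys_info above only tells me about the notebook process itself. To see whether 
PYSPARK_PYTHON at least reached the executors, this is the kind of quick check I can 
run from the notebook (just a sketch; it assumes the usual sc SparkContext that 
bin/pyspark creates):

import sys

# Driver/notebook side, for comparison with the sys_info output above
print "driver: ", sys.executable

# Executor side: this is the interpreter PYSPARK_PYTHON is supposed to control
def worker_python(_):
    import sys
    return sys.executable

print "workers:", set(sc.parallelize(range(4), 4).map(worker_python).collect())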


Here is how I am launching the IPython notebook; I am running it as the ec2-user:

IPYTHON_OPTS="notebook --pylab inline --no-browser --port=7000" $SPARK_HOME/bin/pyspark
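
One thing I am not sure about: when IPYTHON_OPTS is set, bin/pyspark seems to start 
the driver through whatever `ipython` is first on the PATH, so the kernel's 
interpreter may be decided by that script's shebang rather than by PYSPARK_PYTHON 
(which would match the 2.6 paths in the sys_info output). A small sketch to check, 
run as the same ec2-user:

from distutils.spawn import find_executable

# Locate the ipython launcher that bin/pyspark would pick up and read its shebang
path = find_executable("ipython")
print path
print open(path).readline()   # e.g. "#!/usr/bin/python2.6" would explain the 2.6 kernel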

Below are all of the upgrade commands I ran.

Any idea what I missed?

Andy

yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
pip2.7 install ipython[all]
printf "\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n" >> /root/spark/conf/spark-env.sh
source /root/spark/conf/spark-env.sh
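
After those installs, a sketch like the following (again assuming the usual sc in the 
notebook session) is what I would use to confirm that the pip2.7-installed numpy is 
actually importable on every worker:

# Check that each executor can import the numpy installed with pip2.7
def numpy_info(_):
    import sys, numpy
    return (sys.executable, numpy.__version__)

print set(sc.parallelize(range(8), 8).map(numpy_info).collect())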



> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Josh Rosen
>             Fix For: 1.2.0
>
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7 
> the default Python on the Spark AMIs.


