[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7
[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169331#comment-14169331 ]

Nicholas Chammas edited comment on SPARK-922 at 10/13/14 2:19 PM:
------------------------------------------------------------------

[~joshrosen] - Do you mean [this script|https://github.com/mesos/spark-ec2/blob/v4/create_image.sh]? It doesn't seem to have anything related to Python 2.7. Anyway, what I meant was if you were open to holding off on updating the Spark AMIs until we had also figured out how to automate that process per [SPARK-3821]. I should have something for that as soon as this week or next.

was (Author: nchammas):
[~joshrosen] - Do you mean [this script|https://github.com/mesos/spark-ec2/blob/v4/create_image.sh]? I doesn't seem to have anything related to Python 2.7. Anyway, what I meant was if you were open to holding off on updating the Spark AMIs until we had also figured out how to automate that process per [SPARK-3821]. I should have something for that as soon as this week or next.

> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0, 1.1.0
>            Reporter: Josh Rosen
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7
> the default Python on the Spark AMIs.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7
[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14150695#comment-14150695 ]

Andrew Davidson edited comment on SPARK-922 at 9/29/14 7:05 PM:
---------------------------------------------------------------

Here is how I am launching the IPython notebook. I am running as the ec2-user:

{code}
IPYTHON_OPTS="notebook --pylab inline --no-browser --port=7000" $SPARK_HOME/bin/pyspark
{code}

Below are all the upgrade commands I ran. I ran into one small problem: the IPython magic {{%matplotlib inline}} raises an error; you can work around this by commenting it out.

Andy

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
pip2.7 install ipython[all]
printf "\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n" >> /root/spark/conf/spark-env.sh
source /root/spark/conf/spark-env.sh
{code}

was (Author: aedwip):
Here is how I am launching the IPython notebook. I am running as the ec2-user:

{code}
IPYTHON_OPTS="notebook --pylab inline --no-browser --port=7000" $SPARK_HOME/bin/pyspark
{code}

Below are all the upgrade commands I ran. Any idea what I missed?

Andy

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
pip2.7 install ipython[all]
printf "\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n" >> /root/spark/conf/spark-env.sh
source /root/spark/conf/spark-env.sh
{code}

> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Josh Rosen
>             Fix For: 1.2.0
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7
> the default Python on the Spark AMIs.
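The commands above hinge on PySpark actually picking up the 2.7 interpreter. A minimal sanity check, not from the thread, is to ask the interpreter PYSPARK_PYTHON resolves to for its version before launching the notebook; the python3 fallback below is only an assumption so the snippet runs outside a 2014-era AMI.

```shell
# Resolve the interpreter PySpark will use; fall back to whatever is on PATH.
PYSPARK_PYTHON="${PYSPARK_PYTHON:-$(command -v python2.7 || command -v python3)}"

# Ask that interpreter for its own major.minor version.
ver=$("$PYSPARK_PYTHON" -c 'import sys; print("%d.%d" % tuple(sys.version_info[:2]))')

# Flag anything older than 2.7 (the stock AMI shipped 2.6).
case "$ver" in
  2.[0-6]) echo "PYSPARK_PYTHON is too old: $ver" ;;
  *)       echo "PYSPARK_PYTHON ok: $ver" ;;
esac
```

Running this before {{bin/pyspark}} would have caught the stale-2.6 symptom reported later in this thread.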
[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7
[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14150695#comment-14150695 ]

Andrew Davidson edited comment on SPARK-922 at 9/29/14 7:03 PM:
---------------------------------------------------------------

Here is how I am launching the IPython notebook. I am running as the ec2-user:

{code}
IPYTHON_OPTS="notebook --pylab inline --no-browser --port=7000" $SPARK_HOME/bin/pyspark
{code}

Below are all the upgrade commands I ran. Any idea what I missed?

Andy

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
pip2.7 install ipython[all]
printf "\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n" >> /root/spark/conf/spark-env.sh
source /root/spark/conf/spark-env.sh
{code}

was (Author: aedwip):
I must have missed something. I am running the iPython notebook over a ssh tunnel, and I am still running using the old version. I made sure to export PYSPARK_PYTHON=python2.7; I also tried export PYSPARK_PYTHON=/usr/bin/python2.7

{code}
import IPython
print IPython.sys_info()

{'commit_hash': '858d539',
 'commit_source': 'installation',
 'default_encoding': 'UTF-8',
 'ipython_path': '/usr/lib/python2.6/site-packages/ipython-0.13.2-py2.6.egg/IPython',
 'ipython_version': '0.13.2',
 'os_name': 'posix',
 'platform': 'Linux-3.4.37-40.44.amzn1.x86_64-x86_64-with-glibc2.2.5',
 'sys_executable': '/usr/bin/python2.6',
 'sys_platform': 'linux2',
 'sys_version': '2.6.9 (unknown, Sep 13 2014, 00:25:11) \n[GCC 4.8.2 20140120 (Red Hat 4.8.2-16)]'}
{code}

Here is how I am launching the IPython notebook. I am running as the ec2-user:

{code}
IPYTHON_OPTS="notebook --pylab inline --no-browser --port=7000" $SPARK_HOME/bin/pyspark
{code}

Below are all the upgrade commands I ran. Any idea what I missed?

Andy

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
pip2.7 install ipython[all]
printf "\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n" >> /root/spark/conf/spark-env.sh
source /root/spark/conf/spark-env.sh
{code}

> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Josh Rosen
>             Fix For: 1.2.0
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7
> the default Python on the Spark AMIs.
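One likely cause of the "still running 2.6" symptom above is that re-running the upgrade steps appends the export to spark-env.sh blindly. A hedged sketch of that step, made idempotent so repeated runs do not stack duplicate export lines; the SPARK_ENV variable here is an assumption standing in for /root/spark/conf/spark-env.sh on a real cluster (the temp-file fallback is only so the snippet runs anywhere):

```shell
# Target config file; on the EC2 cluster this would be /root/spark/conf/spark-env.sh.
SPARK_ENV="${SPARK_ENV:-$(mktemp)}"
touch "$SPARK_ENV"

# Only append the export if no PYSPARK_PYTHON line is present yet,
# so re-running the upgrade script leaves exactly one setting in place.
if ! grep -q '^export PYSPARK_PYTHON=' "$SPARK_ENV"; then
  printf '\n# Set Spark Python version\nexport PYSPARK_PYTHON=/usr/bin/python2.7\n' >> "$SPARK_ENV"
fi
```

With a guard like this, sourcing spark-env.sh always yields a single, predictable PYSPARK_PYTHON value.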
[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7
[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14098824#comment-14098824 ]

Josh Rosen edited comment on SPARK-922 at 8/15/14 6:10 PM:
----------------------------------------------------------

Updated script, which also updates numpy:

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -t0 -h /root/spark-ec2/slaves pip2.7 install numpy
{code}

And to check that numpy is successfully installed:

{code}
pssh -h /root/spark-ec2/slaves --inline-stdout 'python2.7 -c "import numpy; print numpy"'
{code}

was (Author: joshrosen):
Updated script, which also updates numpy:

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -h /root/spark-ec2/slaves pip2.7 -t0 install numpy
{code}

> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Josh Rosen
>             Fix For: 1.1.0
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7
> the default Python on the Spark AMIs.
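The import check in the comment above can be tried locally before fanning it out over pssh. A hedged sketch; {{check_module}} is a hypothetical helper, not part of the original script, and the python3 fallback is only so the snippet runs outside the 2014-era AMI:

```shell
# check_module INTERPRETER MODULE: try to import MODULE with INTERPRETER,
# printing "ok" or "missing" instead of failing the whole script.
check_module() {
  "$1" -c "import $2" 2>/dev/null && echo "$2: ok" || echo "$2: missing"
}

# Pick whichever interpreter is available on this machine.
py=$(command -v python2.7 || command -v python3)

# Example: the sys module is always importable.
check_module "$py" sys
```

Running {{check_module "$py" numpy}} on the master first makes a failed cluster-wide install easier to distinguish from a pssh problem.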
[jira] [Comment Edited] (SPARK-922) Update Spark AMI to Python 2.7
[ https://issues.apache.org/jira/browse/SPARK-922?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14098824#comment-14098824 ]

Josh Rosen edited comment on SPARK-922 at 8/15/14 6:05 PM:
----------------------------------------------------------

Updated script, which also updates numpy:

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -h /root/spark-ec2/slaves pip2.7 -t0 install numpy
{code}

was (Author: joshrosen):
Updated script, which also updates numpy:

{code}
yum install -y pssh
yum install -y python27 python27-devel
pssh -h /root/spark-ec2/slaves yum install -y python27 python27-devel
wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27
pssh -h /root/spark-ec2/slaves "wget https://bitbucket.org/pypa/setuptools/raw/bootstrap/ez_setup.py -O - | python27"
easy_install-2.7 pip
pssh -h /root/spark-ec2/slaves easy_install-2.7 pip
pip2.7 install numpy
pssh -h /root/spark-ec2/slaves pip2.7 install numpy
{code}

> Update Spark AMI to Python 2.7
> ------------------------------
>
>                 Key: SPARK-922
>                 URL: https://issues.apache.org/jira/browse/SPARK-922
>             Project: Spark
>          Issue Type: Task
>          Components: EC2, PySpark
>    Affects Versions: 0.9.0, 0.9.1, 1.0.0
>            Reporter: Josh Rosen
>             Fix For: 1.1.0
>
> Many Python libraries only support Python 2.7+, so we should make Python 2.7
> the default Python on the Spark AMIs.