Hi

I am running spark on an ec2 cluster. I need to update python to 2.7. I have
been following the directions on
http://nbviewer.ipython.org/gist/JoshRosen/6856670
https://issues.apache.org/jira/browse/SPARK-922



I noticed that when I start a shell using pyspark, I correctly get
python 2.7; however, when I try to start a notebook I get python 2.6.





To fix this, I changed

exec ipython $IPYTHON_OPTS

to

exec ipython2 $IPYTHON_OPTS



One clean way to resolve this would be to add another environment variable,
something like PYSPARK_PYTHON.
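
A minimal sketch of how such a variable could work (PYSPARK_PYTHON is the
name suggested above; the fallback logic here is an assumption about how the
launcher script might use it, not the actual pyspark implementation):

    # Sketch: choose the Python interpreter for PySpark via an
    # environment variable (name and fallback behavior assumed).
    export PYSPARK_PYTHON=python2.7

    # The launcher could fall back to a default interpreter when unset:
    PYSPARK_PYTHON="${PYSPARK_PYTHON:-python}"
    echo "Using interpreter: $PYSPARK_PYTHON"

With the variable exported, the same launcher would pick up python2.7 for both
the plain shell and the notebook, instead of hard-coding ipython vs. ipython2.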



Andy





P.S. Matplotlib does not upgrade because of dependency problems. I'll let
you know once I get this resolved.





