Hi Andy,
You may be interested in https://github.com/apache/spark/pull/2651, a
recent pull request of mine that cleans up and simplifies the configuration
of PySpark's Python executables. For instance, it makes it much easier to
control which Python options are passed when launching the PySpark
Hi
I am running Spark on an EC2 cluster, and I need to update Python to 2.7.
I have been following the directions at
http://nbviewer.ipython.org/gist/JoshRosen/6856670
https://issues.apache.org/jira/browse/SPARK-922
I noticed that when I start a shell using pyspark, I correctly get
Python 2.7, how
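For the Python-selection part of the question, one relevant knob is Spark's PYSPARK_PYTHON environment variable, which tells PySpark which interpreter to run. A minimal sketch, assuming python2.7 has already been installed on every node (e.g. by following the gist above):

```shell
# Tell PySpark which interpreter to use before starting the shell.
# Assumption: python2.7 is on the PATH of every node in the cluster.
export PYSPARK_PYTHON=python2.7

# Then launch PySpark as usual, e.g.:
#   ./bin/pyspark
echo "PySpark will use: $PYSPARK_PYTHON"
```

Note that the variable must be set consistently across the cluster, since the workers spawn their own Python processes; setting it only on the driver node can leave driver and workers on mismatched Python versions.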