Hi Andy,

You may be interested in https://github.com/apache/spark/pull/2651, a
recent pull request of mine which cleans up / simplifies the configuration
of PySpark's Python executables.  For instance, it makes it much easier to
control which Python options are passed when launching the PySpark drivers
and workers.
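
For anyone following along, the env-var approach Andy suggests below could look roughly like this in a launcher script. This is only a sketch: PYSPARK_PYTHON is the variable name proposed in this thread, and the fallback to plain `python` is my assumption for illustration, not necessarily what the pull request does:

```shell
# Hedged sketch: pick the Python executable for PySpark from an
# environment variable, falling back to "python" when it is unset.
# (PYSPARK_PYTHON is the name suggested in this thread; the default
# is an assumption for illustration.)
PYSPARK_PYTHON="${PYSPARK_PYTHON:-python}"
echo "launching PySpark with: $PYSPARK_PYTHON"
```

With something like that in place, `PYSPARK_PYTHON=python2.7 bin/pyspark` would start the shell under Python 2.7 without editing the script.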

- Josh

On Fri, Oct 10, 2014 at 5:24 PM, Andy Davidson <
a...@santacruzintegration.com> wrote:

> Hi
>
> I am running spark on an ec2 cluster. I need to update python to 2.7. I
> have been following the directions on
> http://nbviewer.ipython.org/gist/JoshRosen/6856670
>
> https://issues.apache.org/jira/browse/SPARK-922
>
>
> I noticed that when I start a shell using pyspark, I correctly got
> python2.7; however, when I tried to start a notebook I got python2.6.
>
> To work around this, I changed
>
>     exec ipython $IPYTHON_OPTS
>
> to
>
>     exec ipython2 $IPYTHON_OPTS
>
>
> One clean way to resolve this would be to add another environment
> variable like PYSPARK_PYTHON.
>
>
> Andy
>
>
>
> P.S. Matplotlib does not upgrade because of dependency problems. I’ll
> let you know once I get this resolved.