[Warning: not spark+python specific information]

It's recommended that you explicitly call out python3 in a case like
this (see PEP-0394
<https://www.python.org/dev/peps/pep-0394/#recommendation>, and SO
<https://stackoverflow.com/questions/6908143/should-i-put-shebang-in-python-scripts-and-what-form-should-it-take>).
Your environment is typical: *python* is often a pointer to python2 for
tooling compatibility reasons (other tools or scripts expect to get
python2 when they call *python*), so you should call python3 explicitly
to get the new version. What *python* points to will change over time, so
it's recommended to use *python2* if you explicitly depend on Python 2.
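As a minimal sketch of what that looks like in practice (the script name
here is just illustrative):

```shell
# Create a script whose shebang pins Python 3 explicitly, instead of
# relying on whatever "python" happens to resolve to on the host.
cat > hello.py <<'EOF'
#!/usr/bin/env python3
import sys
print(sys.version_info.major)
EOF
chmod +x hello.py

# Invoking it directly runs under python3 regardless of what "python" is.
./hello.py
```

Any machine with python3 on PATH will print 3 here, even if bare
*python* still resolves to Python 2.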

More generally: It's common/recommended to use a virtual environment +
explicitly stated versions of Python and dependencies, rather than system
Python, so that *python* means exactly what you intend it to. I know very
little about the Spark python dev stack and how challenging it may be to do
this, so please take this with a dose of naiveté.
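For completeness, a sketch of the virtual-environment approach with the
stdlib venv module (the directory name .venv is just a convention):

```shell
# Create an isolated environment from an explicitly chosen interpreter.
python3 -m venv .venv

# Inside the environment, "python" is guaranteed to be the interpreter
# the venv was created with -- no ambiguity about 2 vs 3.
.venv/bin/python -c 'import sys; print(sys.version_info.major)'
```

After `. .venv/bin/activate`, bare *python* on the shell also resolves to
the venv's interpreter, which is the point: *python* means exactly what
you created it to mean.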

- Oli


On Fri, Jul 17, 2020 at 9:58 AM Sean Owen <sro...@gmail.com> wrote:

> So, we are on Python 3 entirely now right?
> It might be just my local Mac env, but "/usr/bin/env python" uses
> Python 2 on my mac.
> Some scripts write "/usr/bin/env python3" now. Should that be the case
> in all scripts?
> Right now the merge script doesn't work for me b/c it was just updated
> to be Python 3 only.
>
>
