I created https://issues.apache.org/jira/browse/SPARK-27884 to track the
work.

On Thu, May 30, 2019 at 2:18 AM Felix Cheung <felixcheun...@hotmail.com>
wrote:

> We don’t usually reference a future release on the website.
>
> > Spark website and state that Python 2 is deprecated in Spark 3.0
>
> I suspect people will then ask when Spark 3.0 is coming out. Might need to
> provide some clarity on that.
>

We can say "the next major release in 2019" instead of Spark 3.0. The Spark
3.0 timeline certainly requires a new thread to discuss.


>
>
> ------------------------------
> *From:* Reynold Xin <r...@databricks.com>
> *Sent:* Thursday, May 30, 2019 12:59:14 AM
> *To:* shane knapp
> *Cc:* Erik Erlandson; Mark Hamstra; Matei Zaharia; Sean Owen; Wenchen
> Fan; Xiangrui Meng; dev; user
> *Subject:* Re: Should python-2 be supported in Spark 3.0?
>
> +1 on Xiangrui’s plan.
>
> On Thu, May 30, 2019 at 7:55 AM shane knapp <skn...@berkeley.edu> wrote:
>
>>> I don't have a good sense of the overhead of continuing to support
>>> Python 2; is it large enough to consider dropping it in Spark 3.0?
>>>
>> from the build/test side, it will actually be pretty easy to continue
>> supporting python2.7 for spark 2.x, as the feature sets won't be expanding.
>>
>
>> that being said, i will be cracking a bottle of champagne when i can
>> delete all of the ansible and anaconda configs for python2.x.  :)
>>
>
On the development side, in a future release that drops Python 2 support, we
can remove the code that maintains Python 2/3 compatibility and start using
Python 3-only features, which is also quite exciting.
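To make that concrete, here is a rough sketch of the kind of cleanup this
would enable. The helper below is hypothetical, not actual PySpark code; it
just contrasts a 2/3-compatible code path with a Python-3-only one that can
use f-strings, keyword-only arguments, and type hints directly:

    # Current style: shims to keep one code path working on Python 2 and 3.
    from __future__ import print_function
    import sys

    if sys.version_info[0] >= 3:
        string_types = (str,)          # Python 3: only str
    else:
        string_types = (basestring,)   # Python 2: str and unicode

    def describe(name, count=0):
        if not isinstance(name, string_types):
            raise TypeError("name must be a string")
        print("%s: %d" % (name, count))

    # After dropping Python 2: plain Python 3, no shims needed.
    def describe(name: str, *, count: int = 0) -> None:
        if not isinstance(name, str):
            raise TypeError("name must be a string")
        print(f"{name}: {count}")

Every such shim we delete is one less thing to test and document.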


>
>> shane
>> --
>> Shane Knapp
>> UC Berkeley EECS Research / RISELab Staff Technical Lead
>> https://rise.cs.berkeley.edu
>>
>
