Based on the discussions, I created a PR that makes Scala 2.12 Spark's
default Scala version, with Scala 2.11 as the alternative. This means
Scala 2.12 will be used by our CI builds, including pull request
builds.

https://github.com/apache/spark/pull/22967
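
For anyone building locally once this lands, switching to the
alternative version should look roughly like the following (a sketch
that assumes the existing dev/change-scala-version.sh script and a
scala-2.11 Maven profile are kept after the default flip):

    # Rewrite the poms to reference the alternative Scala binary version
    ./dev/change-scala-version.sh 2.11

    # Then build against Scala 2.11 via the matching Maven profile
    ./build/mvn -Pscala-2.11 -DskipTests clean package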

We can decide later whether to change the alternative Scala version to
2.13 and drop 2.11, if we want to support only two Scala versions at a
time.
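
As an aside, a quick way to confirm which Scala line a given build was
compiled against is the version banner of a packaged distribution:

    # Prints the build info, including a line like
    # "Using Scala version 2.12.x"
    ./bin/spark-submit --version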

Thanks.

Sincerely,

DB Tsai
----------------------------------------------------------
Web: https://www.dbtsai.com
PGP Key ID: 0x5CED8B896A6BDFA0
On Wed, Nov 7, 2018 at 11:18 AM Sean Owen <sro...@gmail.com> wrote:
>
> The issue isn't making 2.12 the default; it's not dropping 2.11.
> Supporting 2.13 on top of that could mean supporting three Scala
> versions at once, which I claim is just too much. I think the options
> are likely:
>
> - Support 2.11 and 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
> default. Add 2.13 support in 3.x and drop 2.11 in the same release.
> - Deprecate 2.11 right now, via an announcement and/or in Spark 2.4.1
> soon. Drop 2.11 support in Spark 3.0 and support only 2.12.
> - (Same as above, but also add Scala 2.13 support in Spark 3.0 if
> possible.)
>
>
> On Wed, Nov 7, 2018 at 12:32 PM Mark Hamstra <m...@clearstorydata.com> wrote:
> >
> > I'm not following "exclude Scala 2.13". Is there something inherent in 
> > making 2.12 the default Scala version in Spark 3.0 that would prevent us 
> > from supporting the option of building with 2.13?
> >
> > On Tue, Nov 6, 2018 at 5:48 PM Sean Owen <sro...@gmail.com> wrote:
> >>
> >> That's possible here, sure. The issue is: would you exclude Scala 2.13
> >> support in 3.0 for this, if it were otherwise ready to go?
> >> I don't think it's a hard rule that something must have been
> >> previously deprecated before it can be removed in a major release.
> >> Notice is helpful, sure, but there are lots of ways to provide that
> >> notice to end users.
> >> Lots of things are breaking changes in a major release. Or: deprecate
> >> in Spark 2.4.1, if desired?
> >>
> >> On Tue, Nov 6, 2018 at 7:36 PM Wenchen Fan <cloud0...@gmail.com> wrote:
> >> >
> >> > We made Scala 2.11 the default in Spark 2.0, then dropped Scala 2.10
> >> > in Spark 2.3. Shall we follow that precedent and drop Scala 2.11 at
> >> > some point in the Spark 3.x line?
> >> >
> >> > On Wed, Nov 7, 2018 at 8:55 AM Reynold Xin <r...@databricks.com> wrote:
> >> >>
> >> >> Have we deprecated Scala 2.11 already in an existing release?
> >>
