Now that the switch to 2.12 by default has been made, it might be good
to have a serious discussion about dropping 2.11 altogether. Many of
the main arguments have already been discussed, but I don't remember
anyone mentioning how easy it would be to break the 2.11 build now.

For example, the following works fine in 2.12 but breaks in 2.11:

java.util.Arrays.asList("hi").stream().forEach(println)
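
As I understand it, the difference is that 2.12 can pass a Scala
function or method like println where a Java functional interface such
as java.util.function.Consumer is expected, while 2.11 cannot. A rough
sketch of what the same call has to look like to compile on 2.11:

java.util.Arrays.asList("hi").stream().forEach(
  // spell out the Consumer by hand, since 2.11 won't do the SAM
  // conversion for us
  new java.util.function.Consumer[String] {
    override def accept(s: String): Unit = println(s)
  })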

We had a similar issue when we supported Java 1.6 but the builds were
all on 1.7 by default. Every once in a while something would silently
break, because PR builds only check the default, and the Jenkins
builds, which are less closely monitored, would stay broken for a while.

On Tue, Nov 6, 2018 at 11:13 AM DB Tsai <d_t...@apple.com> wrote:
>
> We made Scala 2.11 the default Scala version in Spark 2.0. Now that the next
> Spark version will be 3.0, it's a great time to discuss whether we should make
> Scala 2.12 the default Scala version in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, JDK 11 is
> unlikely to be supported on Scala 2.11 unless we're willing to sponsor the
> needed work, per the discussion in the Scala community:
> https://github.com/scala/scala-dev/issues/559#issuecomment-436160166
>
> We have initial support for Scala 2.12 in Spark 2.4. If we decide to make
> Scala 2.12 the default for Spark 3.0 now, we will have ample time to work on
> bugs and issues that we may run into.
>
> What do you think?
>
> Thanks,
>
> DB Tsai  |  Siri Open Source Technologies [not a contribution]  |   Apple, 
> Inc
>


-- 
Marcelo
