+1 on removing Scala 2.11 support for 3.0 given Scala 2.11 is already EOL.
On Tue, Nov 20, 2018 at 2:53 PM Sean Owen wrote:
PS: pull request at https://github.com/apache/spark/pull/23098
Not going to merge it until there's clear agreement.
On Tue, Nov 20, 2018 at 10:16 AM Ryan Blue wrote:
+1 to removing 2.11 support for 3.0 and a PR.
It sounds like having multiple Scala builds is just not feasible, and I
don't think this will be too disruptive for users since Spark 3.0 is
already a breaking change.
On Tue, Nov 20, 2018 at 7:05 AM Sean Owen wrote:
One more data point -- from looking at the SBT build yesterday, it
seems like most plugin updates require SBT 1.x. And both they and SBT
1.x seem to need Scala 2.12. And the new zinc also does.

Now, the current SBT and zinc and plugins all appear to work OK with
2.12 now, but updating will pretty […]
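For readers less familiar with the moving parts here: the sbt launcher version is pinned separately from the Scala version the project compiles against, which is why an SBT 1.x upgrade and the Scala 2.12 move are coupled but distinct changes. A minimal sketch of the two places involved (version numbers are illustrative for late 2018, not taken from Spark's actual build, which is Maven-first with an sbt wrapper):

```scala
// project/build.properties pins the sbt launcher itself:
//   sbt.version=1.2.6
// sbt 1.x is built on Scala 2.12, and most current plugin releases target it.

// build.sbt pins the Scala version *your project* compiles against:
scalaVersion := "2.12.7"

// Plugins, by contrast, compile against the Scala version of sbt itself,
// which is what drags plugin updates along with an sbt 1.x upgrade.
// (goes in project/plugins.sbt)
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.7")
```

So a project can stay on Scala 2.11 for its own sources while sbt 1.x and its plugins run on 2.12 internally; the coupling Sean describes comes from plugin releases dropping sbt 0.13 support.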
> Maintaining a separate PR builder for 2.11 isn't so bad

i actually beg to differ... it's more of a PITA than you might realize
to manage more than one PRB (we have two already).

a much better solution would be for the test launching code to live
either in the PRB config, or in scripts in the repo […]
I support dropping 2.11 support. My general logic is:

- 2.11 is EOL, and is all the more EOL in the middle of next year when
  Spark 3 arrives
- I haven't heard of a critical dependency that has no 2.12 counterpart
- 2.11 users can stay on 2.4.x, which will be notionally supported
  through, say, end […]
PR builds only check the default. And the jenkins builds, which are
less monitored, would stay broken for a while.

On Tue, Nov 6, 2018 at 11:13 AM DB Tsai wrote:
> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we
> should make Scala 2.12 the default in Spark 3.0.
>
> Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's
> unlikely that JDK 11 will be supported on Scala 2.11 unless we're willing
> to sponsor the needed work, per discussion in the Scala community:
> https://github.com/scala/scala- […]
This seems fine to me. At least we should be primarily testing against
2.12 now.
Shane will need to alter the current 2.12 master build to actually
test 2.11, but that should be a trivial change.
On Thu, Nov 8, 2018 at 12:11 AM DB Tsai wrote:
Based on the discussions, I created a PR that makes Scala 2.12 Spark's
default version, with Scala 2.11 as the alternative. This implies that
Scala 2.12 will be used by our CI builds, including pull request builds.

https://github.com/apache/spark/pull/22967

We can decide later […]
It's not about making 2.12 the default, but about not dropping 2.11.
Supporting 2.13 could mean supporting 3 Scala versions at once, which I
claim is just too much. I think the options are likely:

- Support 2.11 and 2.12 in Spark 3.0. Deprecate 2.11 and make 2.12 the
  default. Add 2.13 support in 3.x and drop […]
Ok, got it -- it's really just an argument for not all of 2.11, 2.12 and
2.13 at the same time; always 2.12; now figure out when we stop 2.11
support and start 2.13 support.
On Wed, Nov 7, 2018 at 11:10 AM Sean Owen wrote:
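To make the "3 Scala versions at once" cost concrete: in an sbt build, every supported Scala version is another full compile/test/publish target, plus version-specific source directories wherever sources have to diverge. A purely illustrative sketch (not Spark's actual build definition; version numbers are what was current in late 2018):

```scala
// build.sbt (illustrative): each entry below is another complete build to
// compile, test, and publish. sbt also picks up version-specific source
// directories (src/main/scala-2.11, src/main/scala-2.12, ...) automatically
// when code has to diverge between Scala versions.
crossScalaVersions := Seq("2.11.12", "2.12.7", "2.13.0-M5")

// Cross-building runs the whole pipeline once per listed version:
//   sbt +compile +test +publishLocal
```

Every dependency also has to exist for each listed version, which is why adding 2.13 while still carrying 2.11 roughly triples the surface area.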
I'm not following "exclude Scala 2.13". Is there something inherent in
making 2.12 the default Scala version in Spark 3.0 that would prevent us
from supporting the option of building with 2.13?
On Tue, Nov 6, 2018 at 5:48 PM Sean Owen wrote:
I spoke with the Scala team at Lightbend. They plan to do a 2.13-RC1
release in January and GA a few months later. Of course, nothing is ever
certain. What's the thinking for the Spark 3.0 timeline? If it's likely to
be late Q1 or in Q2, then it might make sense to add Scala 2.13 as an
alternative […]
That's possible here, sure. The issue is: would you exclude Scala 2.13
support in 3.0 for this, if it were otherwise ready to go?
I think it's not a hard rule that something has to be deprecated
previously to be removed in a major release. The notice is helpful,
sure, but there are lots of ways to […]
> […] is otherwise attainable at the release of Spark 3.0, I wonder if
> that too argues for dropping 2.11 support.
>
> Finally I'll say that Spark itself isn't dropping 2.11 support for a
> while, no matter what; it still exists in the 2.4.x branch […]
So to clarify, only scala 2.12 is supported in Spark 3?

From: Ryan Blue
Sent: Tuesday, November 6, 2018 1:24 PM
To: d_t...@apple.com
Cc: Sean Owen; Spark Dev List; cdelg...@apple.com
Subject: Re: Make Scala 2.12 as default Scala version in Spark 3.0

+1 to Scala 2.12 as the default in Spark 3.0.
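The question above turns on how Scala versioning surfaces to users: Spark artifacts are published once per Scala *binary* version (the `_2.11` / `_2.12` suffix in artifact names), because Scala minor versions are not binary compatible with each other. A small runnable sketch of that convention (the object and helper names here are illustrative, not from any Spark API):

```scala
// Derive the Scala binary version (major.minor) at runtime and build the
// matching artifact name. Spark publishes e.g. spark-core_2.11 and
// spark-core_2.12 as separate artifacts, so an application must depend on
// the one matching the Scala it runs on.
object ScalaBinaryVersionCheck {
  // e.g. "2.12.7" -> "2.12"
  val scalaBinaryVersion: String =
    scala.util.Properties.versionNumberString.split('.').take(2).mkString(".")

  // e.g. sparkArtifact("spark-core") -> "spark-core_2.12" on Scala 2.12
  def sparkArtifact(module: String): String = s"${module}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit =
    println(sparkArtifact("spark-core"))
}
```

This is also why "dropping 2.11" is a packaging decision as much as a code one: it halves the set of artifacts that have to be built, tested, and published per release.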
[…] stay on Spark 2.x, note.

Sean

On Tue, Nov 6, 2018 at 1:13 PM DB Tsai wrote:
> We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
> Spark version will be 3.0, so it's a great time to discuss whether we
> should make Scala 2.12 the default in Spark 3.0. […]
+1 for making Scala 2.12 as default for Spark 3.0.
Bests,
Dongjoon.
On Tue, Nov 6, 2018 at 11:13 AM DB Tsai wrote:
We made Scala 2.11 the default Scala version in Spark 2.0. Now, the next
Spark version will be 3.0, so it's a great time to discuss whether we should
make Scala 2.12 the default Scala version in Spark 3.0.

Scala 2.11 is EOL, and it came out 4.5 years ago; as a result, it's unlikely
that JDK 11 will be supported on Scala 2.11 unless we're willing to sponsor
the needed work, per discussion in the Scala community:
https://github.com/scala/scala- […]