It will take time before all the libraries that Spark depends on are available
for Scala 2.12, so we are not talking Spark 2.1.x, and probably not 2.2.x
either, for Scala 2.12.

It technically makes sense to drop Java 7 and Scala 2.10 around the same time
that Scala 2.12 support is introduced.

We are still heavily dependent on Java 7 (and would be on Python 2.6 if we used
Python, but we don't). I am surprised to see new clusters installed in the last
few months (latest CDH and HDP versions) still running on Java 7. Even getting
Java 8 installed on these clusters so we can use it in YARN is often not an
option; it beats me why this is still happening.
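
To be clear, once Java 8 does make it onto the nodes, the YARN side is just
configuration; a rough sketch (the install path below is only an example):

  // Sketch: point a Spark-on-YARN application at a non-default JDK that is
  // already installed on the nodes.
  val conf = new org.apache.spark.SparkConf()
    .set("spark.yarn.appMasterEnv.JAVA_HOME", "/usr/lib/jvm/java-1.8.0")
    .set("spark.executorEnv.JAVA_HOME", "/usr/lib/jvm/java-1.8.0")

The blocker is getting Java 8 installed in the first place.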

We do not use Scala 2.10 at all anymore.

On Tue, Oct 25, 2016 at 12:31 PM, Ofir Manor <ofir.ma...@equalum.io> wrote:

> I think that 2.1 should include a visible deprecation message about Java
> 7, Scala 2.10 and older Hadoop versions (plus python if there is a
> consensus on that), to give users / admins early warning, followed by
> dropping them from trunk for 2.2 once 2.1 is released.
> Personally, we use only Scala 2.11 on JDK8.
> Cody - Scala 2.12 will likely be released before Spark 2.1, maybe even
> later this week: http://scala-lang.org/news/2.12.0-RC2
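>
> To make the deprecation message concrete, the startup check could be roughly
> along these lines (a sketch only; the object name and exact wording are
> illustrative, not actual Spark code):
>
>   object DeprecationWarnings {
>     // Log a warning when running on platforms slated to lose support.
>     def warnOnDeprecatedPlatforms(warn: String => Unit): Unit = {
>       if (System.getProperty("java.version").startsWith("1.7")) {
>         warn("Support for Java 7 is deprecated and may be removed in a future release.")
>       }
>       if (scala.util.Properties.versionNumberString.startsWith("2.10")) {
>         warn("Support for Scala 2.10 is deprecated and may be removed in a future release.")
>       }
>     }
>   }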
>
> Ofir Manor
>
> Co-Founder & CTO | Equalum
>
> Mobile: +972-54-7801286 | Email: ofir.ma...@equalum.io
>
> On Tue, Oct 25, 2016 at 7:28 PM, Cody Koeninger <c...@koeninger.org>
> wrote:
>
>> I think supporting only one version of Scala at any given time is not
>> sufficient; two probably is.
>>
>> I.e., don't drop 2.10 before 2.12 is out and supported.
>>
>> On Tue, Oct 25, 2016 at 10:56 AM, Sean Owen <so...@cloudera.com> wrote:
>> > The general forces are that new versions of things to support emerge, and
>> > are valuable to support, but have some cost to support in addition to old
>> > versions. And the old versions become less used and therefore less valuable
>> > to support, and at some point it tips to being more cost than value. It's
>> > hard to judge these costs and benefits.
>> >
>> > Scala is perhaps the trickiest one because of the general mutual
>> > incompatibilities across minor versions. The cost of supporting multiple
>> > versions is high, and a third version is about to arrive. That's probably
>> > the most pressing question. It's actually biting with some regularity now,
>> > with compile errors on 2.10.
>> >
>> > (Python I confess I don't have an informed opinion about.)
>> >
>> > Java and Hadoop are not as urgent because they're more backwards-compatible.
>> > Anecdotally, I'd be surprised if anyone today would "upgrade" to Java 7 or
>> > an old Hadoop version. And I think that's really the question. Even if one
>> > decided to drop support for all this in 2.1.0, it would not mean people
>> > can't use Spark with these things. It merely means they can't necessarily
>> > use Spark 2.1.x. This is why we have maintenance branches for 1.6.x, 2.0.x.
>> >
>> > Tying Scala 2.11/12 support to Java 8 might make sense.
>> >
>> > In fact, I think that's part of the reason that an update in master, perhaps
>> > 2.1.x, could be overdue, because it actually is just the beginning of the
>> > end of the support burden. If you want to stop dealing with these in ~6
>> > months they need to stop being supported in minor branches by right about
>> > now.
>> >
>> >
>> >
>> >
>> > On Tue, Oct 25, 2016 at 4:47 PM Mark Hamstra <m...@clearstorydata.com>
>> > wrote:
>> >>
>> >> What's changed since the last time we discussed these issues, about 7
>> >> months ago?  Or, another way to formulate the question: What are the
>> >> threshold criteria that we should use to decide when to end Scala 2.10
>> >> and/or Java 7 support?
>> >>
>> >> On Tue, Oct 25, 2016 at 8:36 AM, Sean Owen <so...@cloudera.com> wrote:
>> >>>
>> >>> I'd like to gauge where people stand on the issue of dropping support for
>> >>> a few things that were considered for 2.0.
>> >>>
>> >>> First: Scala 2.10. We've seen a number of build breakages this week
>> >>> because the PR builder only tests 2.11. No big deal at this stage, but it
>> >>> did cause me to wonder whether it's time to plan to drop 2.10 support,
>> >>> especially with 2.12 coming soon.
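>> >>>
>> >>> To give one illustrative example of the kind of 2.10/2.11 gap involved
>> >>> (not necessarily the exact breakages we hit): 2.11 lifted the 22-field
>> >>> limit on case classes, so something like this compiles on 2.11 but is
>> >>> rejected by 2.10 with an "Implementation restriction" error.
>> >>>
>> >>>   // Compiles on Scala 2.11+; 2.10 caps case classes at 22 parameters.
>> >>>   case class Wide(
>> >>>     f1: Int, f2: Int, f3: Int, f4: Int, f5: Int, f6: Int, f7: Int, f8: Int,
>> >>>     f9: Int, f10: Int, f11: Int, f12: Int, f13: Int, f14: Int, f15: Int,
>> >>>     f16: Int, f17: Int, f18: Int, f19: Int, f20: Int, f21: Int, f22: Int,
>> >>>     f23: Int)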
>> >>>
>> >>> Next, Java 7. It's reasonably old and out of public updates at this
>> >>> stage. It's not that painful to keep supporting, to be honest, but
>> >>> dropping it would simplify some bits of code, some scripts, some testing.
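>> >>>
>> >>> (As a sketch of the kind of small cleanup that enables, not a specific
>> >>> planned change: with a Java 8 floor, the JDK's own Base64 can replace
>> >>> third-party helpers.)
>> >>>
>> >>>   object Base64Util {
>> >>>     // Java 8+: java.util.Base64 replaces commons-codec / Guava encoders.
>> >>>     def encode(s: String): String =
>> >>>       java.util.Base64.getEncoder.encodeToString(
>> >>>         s.getBytes(java.nio.charset.StandardCharsets.UTF_8))
>> >>>   }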
>> >>>
>> >>> Hadoop versions: I think the general argument is that most anyone would
>> >>> be using at least 2.6, and it would simplify some code that has to use
>> >>> reflection to call not-even-that-new APIs. It would remove some moderate
>> >>> complexity in the build.
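>> >>>
>> >>> The reflection pattern in question looks roughly like this (class and
>> >>> method names here are purely hypothetical, not the actual Spark or Hadoop
>> >>> code):
>> >>>
>> >>>   // Call a newer Hadoop API if it exists on the classpath at runtime,
>> >>>   // and quietly do nothing on older Hadoop versions.
>> >>>   def setCallerContextIfAvailable(context: String): Unit = {
>> >>>     try {
>> >>>       val clazz = Class.forName("org.apache.hadoop.hypothetical.CallerContext")
>> >>>       clazz.getMethod("setCurrent", classOf[String]).invoke(null, context)
>> >>>     } catch {
>> >>>       case _: ClassNotFoundException | _: NoSuchMethodException =>
>> >>>         // the API simply isn't there on this Hadoop version
>> >>>     }
>> >>>   }
>> >>>
>> >>> With a minimum Hadoop version that has the API, that collapses to a plain
>> >>> method call.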
>> >>>
>> >>>
>> >>> "When" is a tricky question. Although it's a little aggressive for
>> >>> minor releases, I think these will all happen before 3.x regardless.
>> >>> 2.1.0 is not out of the question, though coming soon. What about ... 2.2.0?
>> >>>
>> >>>
>> >>> Although I tend to favor dropping support, I'm mostly asking for current
>> >>> opinions.
>> >>
>> >>
>> >
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
>
