Yes, some organizations do lag behind the current release, sometimes by a
significant amount.  That is a bug, not a feature -- and one that increases
pressure toward fragmentation of the Spark community.  To date, that hasn't
been a significant problem, and I think that is mainly because the factors
motivating a decision not to upgrade in a timely fashion are almost
entirely internal to a lagging organization -- Spark itself has tried to
present minimal impediments to upgrading as soon as a new release is
available.

Changing the supported Java and Scala versions within the same quarter in
which the next version is scheduled for release would represent more than a
minimal impediment, and would increase fragmentation pressure to a degree
with which I am not entirely comfortable.

On Mon, Apr 11, 2016 at 12:10 PM, Daniel Siegmann <
daniel.siegm...@teamaol.com> wrote:

> On Wed, Apr 6, 2016 at 2:57 PM, Mark Hamstra <m...@clearstorydata.com>
> wrote:
>
> ... My concern is that either of those options will take more resources
>> than some Spark users will have available in the ~3 months remaining before
>> Spark 2.0.0, which will cause fragmentation into Spark 1.x and Spark 2.x
>> user communities. ...
>>
>
> It's not as if everyone is going to switch over to Spark 2.0.0 on release
> day anyway. It's not that unusual to see posts on the user list from people
> who are a version or two behind. I think a few extra months' lag time will
> be OK for a major version.
>
> Besides, in my experience if you give people more time to upgrade, they're
> just going to kick the can down the road a ways and you'll eventually end
> up with the same problem. I don't see a good reason to *not* drop Java 7
> and Scala 2.10 support with Spark 2.0.0. Time to bite the bullet. If
> companies stick with Spark 1.x and find themselves missing the new features
> in the 2.x line, that will be a good motivation for them to upgrade.
>
> ~Daniel Siegmann
>
