Fwd: Spark 1.x - End of life

2017-10-24 Thread Ismaël Mejía
Thanks for your answer, Matei. I agree that a more explicit
maintenance policy is needed (even for the 2.x releases). I did not
immediately find anything about this on the website, so I ended up
relying on the Wikipedia article, which says that the 1.6.x line is
still maintained.

I understand that Spark, as an open source project, can get updates
if the community contributes them, but it is probably also a good
idea to be clear about the expectations for end users. I suppose some
users who could migrate to version 2 won’t do it as long as 1.x is
still supported (note that ‘support’ can be tricky given how
different companies re-package and maintain Spark, but that is a
different discussion). Anyway, it would be great to have this defined
somewhere; maybe it is worth a discussion on dev@.

On Thu, Oct 19, 2017 at 11:20 PM, Matei Zaharia  wrote:
> Hi Ismael,
>
> It depends on what you mean by “support”. In general, there won’t be new 
> feature releases for 1.x (e.g. Spark 1.7) because all the new features are
> being added to the master branch. However, there is always room for bug fix 
> releases if there is a catastrophic bug, and committers can make those at any 
> time. In general though, I’d recommend moving workloads to Spark 2.x. We 
> tried to make the migration as easy as possible (a few APIs changed, but not 
> many), and 2.x has been out for a long time now and is widely used.
>
> We should perhaps write a more explicit maintenance policy, but all of this 
> is run based on what committers want to work on; if someone thinks that 
> there’s a serious enough issue in 1.6 to update it, they can put together a 
> new release. It does help to hear from users about this though, e.g. if you 
> think there’s a significant issue that people are missing.
>
> Matei
>
>> On Oct 19, 2017, at 5:20 AM, Ismaël Mejía  wrote:
>>
>> Hello,
>>
>> I noticed that some of the Big Data / cloud-managed Hadoop
>> distributions are starting to phase out or deprecate Spark 1.x, and
>> I was wondering whether the Spark community has already decided when
>> it will end support for Spark 1.x. I ask this also because the
>> latest release in the series is already almost a year old. Any idea
>> on this?
>>
>> Thanks,
>> Ismaël




Re: Spark 1.x - End of life

2017-10-19 Thread Matei Zaharia
Hi Ismael,

It depends on what you mean by “support”. In general, there won’t be new 
feature releases for 1.x (e.g. Spark 1.7) because all the new features are 
being added to the master branch. However, there is always room for bug fix 
releases if there is a catastrophic bug, and committers can make those at any 
time. In general though, I’d recommend moving workloads to Spark 2.x. We tried 
to make the migration as easy as possible (a few APIs changed, but not many), 
and 2.x has been out for a long time now and is widely used.
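
To make “a few APIs changed, but not many” concrete, here is a minimal
before/after sketch of the most visible change, the move from
SQLContext/HiveContext to the unified SparkSession entry point. The app
name and the "people.json" path below are placeholders, not files that
ship with Spark:

    import org.apache.spark.sql.SparkSession

    object MigrationSketch {
      def main(args: Array[String]): Unit = {
        // Spark 1.x style, for comparison:
        //   val sc = new SparkContext(new SparkConf().setAppName("app"))
        //   val sqlContext = new org.apache.spark.sql.SQLContext(sc)
        //   val df = sqlContext.read.json("people.json")
        //   df.registerTempTable("people")     // deprecated in 2.x

        // Spark 2.x: one unified entry point.
        val spark = SparkSession.builder()
          .appName("migration-sketch")
          .getOrCreate()

        // The old SparkContext is still reachable for lower-level APIs.
        println(s"Running on Spark ${spark.sparkContext.version}")

        val df = spark.read.json("people.json")
        df.createOrReplaceTempView("people")   // replaces registerTempTable
        spark.sql("SELECT * FROM people").show()

        spark.stop()
      }
    }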

We should perhaps write a more explicit maintenance policy, but all of this is 
run based on what committers want to work on; if someone thinks that there’s a 
serious enough issue in 1.6 to update it, they can put together a new release. 
It does help to hear from users about this though, e.g. if you think there’s a 
significant issue that people are missing.

Matei

> On Oct 19, 2017, at 5:20 AM, Ismaël Mejía  wrote:
> 
> Hello,
> 
> I noticed that some of the Big Data / cloud-managed Hadoop
> distributions are starting to phase out or deprecate Spark 1.x, and
> I was wondering whether the Spark community has already decided when
> it will end support for Spark 1.x. I ask this also because the
> latest release in the series is already almost a year old. Any idea
> on this?
> 
> Thanks,
> Ismaël





Spark 1.x - End of life

2017-10-19 Thread Ismaël Mejía
Hello,

I noticed that some of the Big Data / cloud-managed Hadoop
distributions are starting to phase out or deprecate Spark 1.x, and
I was wondering whether the Spark community has already decided when
it will end support for Spark 1.x. I ask this also because the
latest release in the series is already almost a year old. Any idea
on this?

Thanks,
Ismaël
