Re: Should we consider a Spark 2.1.1 release?

2017-03-20 Thread Ted Yu
Timur: Mind starting a new thread? I have the same question as you do.

Re: Should we consider a Spark 2.1.1 release?

2017-03-20 Thread Holden Karau
I think questions around how long the 1.6 series will be supported are really important, but probably belong in a different thread than the 2.1.1 release discussion.

Re: Should we consider a Spark 2.1.1 release?

2017-03-20 Thread Timur Shenkao
Hello guys, Spark benefits from stable versions, not frequent ones. A lot of people still have 1.6.x in production. Those who want the freshest (like me) can always deploy nightly builds. My question is: how long will version 1.6 be supported?

Re: Should we consider a Spark 2.1.1 release?

2017-03-19 Thread Holden Karau
This discussion seems like it might benefit from its own thread: we've previously decided to lengthen release cycles, and if there are different opinions about that, it seems unrelated to the specific 2.1.1 release.

Re: Should we consider a Spark 2.1.1 release?

2017-03-19 Thread Jacek Laskowski
Hi Mark, I appreciate your comment. My thinking is that the more frequent the minor and patch releases, the more often end users can give them a shot and be part of the bigger release cycle for major releases. Spark's an OSS project and we all can make mistakes, and my thinking is that the more

Re: Should we consider a Spark 2.1.1 release?

2017-03-19 Thread Mark Hamstra
That doesn't necessarily follow, Jacek. There is a point where too frequent releases decrease quality. That is because releases don't come for free -- each one demands a considerable amount of time from release managers, testers, etc. -- time that would otherwise typically be devoted to improving

Re: Should we consider a Spark 2.1.1 release?

2017-03-19 Thread Jacek Laskowski
+1. Smaller and more frequent releases (so major releases get even more quality). Jacek

Re: Should we consider a Spark 2.1.1 release?

2017-03-16 Thread Nick Pentreath
Spark 1.5.1 had 87 issues with fix version 1.5.1, one month after 1.5.0. Spark 1.6.1 had 123 issues two months after 1.6.0. 2.0.1 was larger (317 issues) at three months after 2.0.0, which makes sense given how large a release it was. We are at 185 for 2.1.1, three months after (and not released yet, so it could slip

Re: Should we consider a Spark 2.1.1 release?

2017-03-15 Thread Michael Armbrust
Hey Holden, thanks for bringing this up! I think we usually cut patch releases when there are enough fixes to justify it, sometimes just a few weeks after the release. I guess if we are at 3 months, Spark 2.1.0 was a pretty good release :) That said, it is probably time. I was about to start

Re: Should we consider a Spark 2.1.1 release?

2017-03-13 Thread Holden Karau
I'd be happy to do the work of coordinating a 2.1.1 release if that's a thing a committer can do (I think the release coordinator for the most recent Arrow release was a committer, and the final publish step took a PMC member to upload, but other than that I don't remember any issues).

Re: Should we consider a Spark 2.1.1 release?

2017-03-13 Thread Sean Owen
It seems reasonable to me, in that other x.y.1 releases have followed ~2 months after the x.y.0 release, and it's been about 3 months since 2.1.0. Related: creating releases is tough work, so I feel kind of bad voting for someone else to do that much work. Would it make sense to deputize another

Re: Should we consider a Spark 2.1.1 release?

2017-03-13 Thread Felix Cheung
Should we consider a Spark 2.1.1 release? Hi Spark Devs, Spark 2.1 has been out since end of December <http://apache-spark-developers-list.1001551.n3.nabble.com/ANNOUNCE-Announcing-Apache-Spark-2-1-0-td20390.html> and we've got quite a few fixes merged for 2.1.1 <https://issues.apache.org/ji

Should we consider a Spark 2.1.1 release?

2017-03-13 Thread Holden Karau
Hi Spark Devs, Spark 2.1 has been out since the end of December and we've got quite a few fixes merged for 2.1.1