Re: Spark 3.0 branch cut and code freeze on Jan 31?

2019-12-24 Thread Jungtaek Lim
Jan 31 sounds good to me. Just curious, do we allow any exceptions to the code freeze? One thing that comes to mind: a feature could have multiple subtasks, where some have already been merged and the remaining subtask(s) are still in review. In that case, do we allow those subtasks a few more days?

Re: [ANNOUNCE] Announcing Apache Spark 3.0.0-preview2

2019-12-24 Thread Jungtaek Lim
Great work, Yuming! Happy Holidays.

Re: [ANNOUNCE] Announcing Apache Spark 3.0.0-preview2

2019-12-24 Thread Dongjoon Hyun
Indeed! Thank you again, Yuming and all. Bests, Dongjoon.

Re: [ANNOUNCE] Announcing Apache Spark 3.0.0-preview2

2019-12-24 Thread Takeshi Yamamuro
Great work, Yuming! Bests, Takeshi

Re: Spark 3.0 branch cut and code freeze on Jan 31?

2019-12-24 Thread Takeshi Yamamuro
Looks nice, happy holidays, all! Bests, Takeshi

Re: [ANNOUNCE] Announcing Apache Spark 3.0.0-preview2

2019-12-24 Thread Xiao Li
Thank you all. Happy Holidays! Xiao

[ANNOUNCE] Announcing Apache Spark 3.0.0-preview2

2019-12-24 Thread Yuming Wang
Hi all, To enable wide-scale community testing of the upcoming Spark 3.0 release, the Apache Spark community has posted a new preview release of Spark 3.0. This preview is *not a stable release in terms of either API or functionality*, but it is meant to give the community early access to try the code that will become Spark 3.0.
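For anyone who wants to give the preview a quick try, the sketch below is one way to smoke-test it from Scala: it starts a local SparkSession, prints the version it picked up, and runs a trivial job. This is only an illustration under the assumption that the 3.0.0-preview2 jars are already on the classpath; the object name PreviewSmokeTest and the local setup are illustrative, not something prescribed by the announcement.

  import org.apache.spark.sql.SparkSession

  // Minimal local smoke test: start a session, confirm the version in use,
  // and run a trivial job to make sure basic execution works.
  object PreviewSmokeTest {
    def main(args: Array[String]): Unit = {
      val spark = SparkSession.builder()
        .master("local[*]")
        .appName("spark-3.0.0-preview2-smoke-test")
        .getOrCreate()

      println(s"Running against Spark ${spark.version}")                 // expect 3.0.0-preview2
      println(s"range(0, 100).count() = ${spark.range(100).count()}")    // expect 100

      spark.stop()
    }
  }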

Re: Spark 3.0 branch cut and code freeze on Jan 31?

2019-12-24 Thread Dongjoon Hyun
+1 for January 31st. Bests, Dongjoon.

Re: Spark 3.0 branch cut and code freeze on Jan 31?

2019-12-24 Thread Xiao Li
Jan 31 is pretty reasonable. Happy Holidays! Xiao

Re: Spark 3.0 branch cut and code freeze on Jan 31?

2019-12-24 Thread Sean Owen
Yep, always happens. Is earlier realistic, like Jan 15? It's all arbitrary, but indeed this has been in progress for a while, and there's a downside to not releasing it and letting the gap to 3.0 grow larger. On my end I don't know of anything that's holding up a release; is it basically DSv2?