Thank you, Dongjoon, for being a release manager. If the assumed dates are ok, I would like to volunteer as the 2.3.4 release manager.
Best Regards,
Kazuaki Ishizaki

From: Dongjoon Hyun <dongjoon.h...@gmail.com>
To: dev <dev@spark.apache.org>, "user @spark" <u...@spark.apache.org>, Apache Spark PMC <priv...@spark.apache.org>
Date: 2019/07/13 07:18
Subject: [EXTERNAL] Re: Release Apache Spark 2.4.4 before 3.0.0

Thank you, Jacek.

BTW, I added `@private` since we need the PMC's help to make an Apache Spark release. Can I get more feedback from the other PMC members? Please let me know if you have any concerns (e.g. release date or release manager).

As one of the community members, I assumed the following (if we are on schedule):

- 2.4.4 at the end of July
- 2.3.4 at the end of August (since 2.3.0 was released at the end of February 2018)
- 3.0.0 (possibly September?)
- 3.1.0 (January 2020?)

Bests,
Dongjoon.

On Thu, Jul 11, 2019 at 1:30 PM Jacek Laskowski <ja...@japila.pl> wrote:

Hi,

Thanks, Dongjoon Hyun, for stepping up as a release manager! Much appreciated. If there's a volunteer to cut a release, I'm always happy to support it. In addition, the more frequent the releases, the better for end users: they get the choice to upgrade and pick up all the latest fixes, or to wait. It's their call, not ours (if we kept them waiting). My big two yeses for the release!

Jacek

On Tue, 9 Jul 2019, 18:15 Dongjoon Hyun, <dongjoon.h...@gmail.com> wrote:

Hi, All.

Spark 2.4.3 was released two months ago (8th May). As of today (9th July), there are 45 fixes in `branch-2.4`, including the following correctness or blocker issues:
- SPARK-26038 Decimal toScalaBigInt/toJavaBigInteger not work for decimals not fitting in long
- SPARK-26045 Error in the spark 2.4 release package with the spark-avro_2.11 dependency
- SPARK-27798 from_avro can modify variables in other rows in local mode
- SPARK-27907 HiveUDAF should return NULL in case of 0 rows
- SPARK-28157 Make SHS clear KVStore LogInfo for the blacklist entries
- SPARK-28308 CalendarInterval sub-second part should be padded before parsing

It would be great if we could have Spark 2.4.4 before we get busier with 3.0.0. If it's okay, I'd like to volunteer as the 2.4.4 release manager and roll it next Monday (15th July). What do you think?

Bests,
Dongjoon.