+1 on a 3.0.1 soon.

It would probably be nice if some Scala experts could take a look at
https://issues.apache.org/jira/browse/SPARK-32051 and include the fix in
3.0.1 if possible.
It looks like APIs designed to work with both Scala 2.11 and Java become
ambiguous under Scala 2.12.
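For readers unfamiliar with this class of problem, here is a minimal sketch of the kind of ambiguity involved (the names below are illustrative, modeled loosely after the Dataset.foreachPartition overload pair, and are not Spark's actual API): in Scala 2.12 a function literal can SAM-convert to a Java-style functional interface, so a call site that used to match only the Scala overload can now match both.

```scala
// Illustrative only: a Java-style functional interface alongside a Scala
// function overload, loosely modeled after Dataset.foreachPartition.
trait ForeachPartitionFunction[T] { def call(it: Iterator[T]): Unit }

object Sink {
  def foreachPartition(f: Iterator[String] => Unit): String = "scala-overload"
  def foreachPartition(f: ForeachPartitionFunction[String]): String = "java-overload"
}

object Demo {
  def main(args: Array[String]): Unit = {
    // Sink.foreachPartition(it => ())
    // ^ In Scala 2.12 this fails to compile: the function literal can
    //   SAM-convert to ForeachPartitionFunction, so both overloads apply.

    // A value with an explicit function type is not SAM-converted, so it
    // selects the Scala overload unambiguously.
    val f: Iterator[String] => Unit = _ => ()
    println(Sink.foreachPartition(f)) // prints "scala-overload"
  }
}
```

The usual caller-side workaround is exactly this: bind the function to a `val` with an explicit function type (or ascribe the type) so overload resolution no longer considers the SAM conversion.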

On Wed, Jun 24, 2020 at 4:52 AM Jules Damji <dmat...@comcast.net> wrote:

> +1 (non-binding)
>
> Sent from my iPhone
> Pardon the dumb thumb typos :)
>
> On Jun 23, 2020, at 11:36 AM, Holden Karau <hol...@pigscanfly.ca> wrote:
>
> +1 on a patch release soon
>
> On Tue, Jun 23, 2020 at 10:47 AM Reynold Xin <r...@databricks.com> wrote:
>
>> +1 on doing a new patch release soon. I saw some of these issues when
>> preparing the 3.0 release, and some of them are very serious.
>>
>>
>> On Tue, Jun 23, 2020 at 8:06 AM, Shivaram Venkataraman <
>> shiva...@eecs.berkeley.edu> wrote:
>>
>>> +1 Thanks Yuanjian -- I think it'll be great to have a 3.0.1 release
>>> soon.
>>>
>>> Shivaram
>>>
>>> On Tue, Jun 23, 2020 at 3:43 AM Takeshi Yamamuro <linguin....@gmail.com>
>>> wrote:
>>>
>>> Thanks for the heads-up, Yuanjian!
>>>
>>> I also noticed branch-3.0 already has 39 commits after Spark 3.0.0.
>>>
>>> wow, the updates are so quick. Anyway, +1 for the release.
>>>
>>> Bests,
>>> Takeshi
>>>
>>> On Tue, Jun 23, 2020 at 4:59 PM Yuanjian Li <xyliyuanj...@gmail.com>
>>> wrote:
>>>
>>> Hi dev-list,
>>>
>>> I’m writing this to raise the discussion about Spark 3.0.1 feasibility
>>> since 4 blocker issues were found after Spark 3.0.0:
>>>
>>> [SPARK-31990] A state store compatibility break causes a correctness
>>> issue when a streaming query using `dropDuplicates` resumes from a
>>> checkpoint written by an older Spark version.
>>>
>>> [SPARK-32038] A regression in handling NaN values in
>>> COUNT(DISTINCT)
>>>
>>> [SPARK-31918][WIP] CRAN requires SparkR to work with the latest R
>>> 4.0. This makes the 3.0 release unavailable on CRAN, since it only
>>> supports R [3.5, 4.0)
>>>
>>> [SPARK-31967] Downgrade vis.js to fix Jobs UI loading time regression
>>>
>>> I also noticed branch-3.0 already has 39 commits after Spark 3.0.0. I
>>> think it would be great to have a Spark 3.0.1 release to deliver the
>>> critical fixes.
>>>
>>> Any comments are appreciated.
>>>
>>> Best,
>>>
>>> Yuanjian
>>>
>>> --
>>> ---
>>> Takeshi Yamamuro
>>>
>>> --------------------------------------------------------------------- To
>>> unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>
>>
>>
>
> --
> Twitter: https://twitter.com/holdenkarau
> Books (Learning Spark, High Performance Spark, etc.):
> https://amzn.to/2MaRAG9
> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>
>
