FYI, we currently have one blocker issue (
https://issues.apache.org/jira/browse/SPARK-24535); I will start the
release after it is fixed.

Also, please let me know if there are other blockers or fixes you want to
land in the 2.3.2 release.

Thanks
Saisai

Saisai Shao <sai.sai.s...@gmail.com> wrote on Mon, Jul 2, 2018 at 1:16 PM:

> I will start preparing the release.
>
> Thanks
>
> John Zhuge <jzh...@apache.org> wrote on Sat, Jun 30, 2018 at 10:31 AM:
>
>> +1. Looking forward to the critical fixes in 2.3.2.
>>
>> On Thu, Jun 28, 2018 at 9:37 AM Ryan Blue <rb...@netflix.com.invalid>
>> wrote:
>>
>>> +1
>>>
>>> On Thu, Jun 28, 2018 at 9:34 AM Xiao Li <gatorsm...@gmail.com> wrote:
>>>
>>>> +1. Thanks, Saisai!
>>>>
>>>> The impact of SPARK-24495 is large. We should release Spark 2.3.2 ASAP.
>>>>
>>>> Thanks,
>>>>
>>>> Xiao
>>>>
>>>> 2018-06-27 23:28 GMT-07:00 Takeshi Yamamuro <linguin....@gmail.com>:
>>>>
>>>>> +1. I've heard that some Spark users skipped v2.3.1 because of these bugs.
>>>>>
>>>>> On Thu, Jun 28, 2018 at 3:09 PM Xingbo Jiang <jiangxb1...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> +1
>>>>>>
>>>>>> Wenchen Fan <cloud0...@gmail.com> wrote on Thu, Jun 28, 2018 at 2:06 PM:
>>>>>>
>>>>>>> Hi Saisai, that's great! Please go ahead!
>>>>>>>
>>>>>>> On Thu, Jun 28, 2018 at 12:56 PM Saisai Shao <sai.sai.s...@gmail.com>
>>>>>>> wrote:
>>>>>>>
>>>>>>>> +1. As Marcelo mentioned, these issues seem quite severe.
>>>>>>>>
>>>>>>>> I can work on the release if we're short of hands :).
>>>>>>>>
>>>>>>>> Thanks
>>>>>>>> Jerry
>>>>>>>>
>>>>>>>>
>>>>>>>> Marcelo Vanzin <van...@cloudera.com.invalid> wrote on Thu, Jun 28,
>>>>>>>> 2018 at 11:40 AM:
>>>>>>>>
>>>>>>>>> +1. SPARK-24589 / SPARK-24552 are kinda nasty and we should
>>>>>>>>> get fixes for those out.
>>>>>>>>>
>>>>>>>>> (Those are what delayed 2.2.2 and 2.1.3 for those watching...)
>>>>>>>>>
>>>>>>>>> On Wed, Jun 27, 2018 at 7:59 PM, Wenchen Fan <cloud0...@gmail.com>
>>>>>>>>> wrote:
>>>>>>>>> > Hi all,
>>>>>>>>> >
>>>>>>>>> > Spark 2.3.1 was released just a while ago, but unfortunately
>>>>>>>>> > we discovered and fixed some critical issues afterward.
>>>>>>>>> >
>>>>>>>>> > SPARK-24495: SortMergeJoin may produce wrong result.
>>>>>>>>> > This is a serious correctness bug and is easy to hit: the
>>>>>>>>> > left table's join key is duplicated in the join condition,
>>>>>>>>> > e.g. `WHERE t1.a = t2.b AND t1.a = t2.c`, and the join is a
>>>>>>>>> > sort merge join. This bug is only present in Spark 2.3.
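>>>>>>>>> >
>>>>>>>>> > As a rough illustration (not part of the original report), a
>>>>>>>>> > query of this shape can hit the bug once the planner picks a
>>>>>>>>> > sort merge join, assuming t1(a) and t2(b, c) are registered
>>>>>>>>> > tables:
>>>>>>>>> >
>>>>>>>>> >   // Scala / spark-shell: disable broadcast joins so the
>>>>>>>>> >   // planner falls back to a sort merge join
>>>>>>>>> >   spark.conf.set("spark.sql.autoBroadcastJoinThreshold", "-1")
>>>>>>>>> >   // t1.a appears twice in the condition (duplicated left key)
>>>>>>>>> >   spark.sql(
>>>>>>>>> >     "SELECT * FROM t1, t2 WHERE t1.a = t2.b AND t1.a = t2.c"
>>>>>>>>> >   ).show()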
>>>>>>>>> >
>>>>>>>>> > SPARK-24588: stream-stream join may produce wrong result
>>>>>>>>> > This is a correctness bug in a new Spark 2.3 feature, the
>>>>>>>>> > stream-stream join. Users can hit this bug if one side of the
>>>>>>>>> > join is partitioned by a subset of the join keys.
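>>>>>>>>> >
>>>>>>>>> > A minimal sketch of the affected pattern (the streaming
>>>>>>>>> > DataFrames `left` and `right` below are hypothetical, joined
>>>>>>>>> > on columns a and b):
>>>>>>>>> >
>>>>>>>>> >   import org.apache.spark.sql.functions.col
>>>>>>>>> >   // right is repartitioned by only a subset (a) of the join
>>>>>>>>> >   // keys (a, b), which is the condition that triggers the bug
>>>>>>>>> >   val joined = left.join(right.repartition(col("a")), Seq("a", "b"))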
>>>>>>>>> >
>>>>>>>>> > SPARK-24552: Task attempt numbers are reused when stages are
>>>>>>>>> > retried
>>>>>>>>> > This is a long-standing bug in the output committer that may
>>>>>>>>> > introduce data corruption.
>>>>>>>>> >
>>>>>>>>> > SPARK-24542: UDFXPathXXXX allows users to pass carefully
>>>>>>>>> > crafted XML to access arbitrary files
>>>>>>>>> > This is a potential security issue if users build an access
>>>>>>>>> > control module on top of Spark.
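>>>>>>>>> >
>>>>>>>>> > For reference, the affected functions are the built-in SQL
>>>>>>>>> > xpath UDFs, e.g. (a harmless call, not the exploit):
>>>>>>>>> >
>>>>>>>>> >   spark.sql("SELECT xpath_string('<a><b>v</b></a>', 'a/b')").show()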
>>>>>>>>> >
>>>>>>>>> > I think we need a Spark 2.3.2 to address these issues
>>>>>>>>> > (especially the correctness bugs) ASAP. Any thoughts?
>>>>>>>>> >
>>>>>>>>> > Thanks,
>>>>>>>>> > Wenchen
>>>>>>>>>
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Marcelo
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> ---------------------------------------------------------------------
>>>>>>>>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>>>>>>>>
>>>>>>>>>
>>>>>
>>>>> --
>>>>> ---
>>>>> Takeshi Yamamuro
>>>>>
>>>>
>>>>
>>>
>>> --
>>> Ryan Blue
>>> Software Engineer
>>> Netflix
>>>
>>> --
>>> John Zhuge
>>>
>>
