I expect to see an RC2 too. I guess he's just sticking to the standard
process, leaving the vote open until the end.
It hasn't got enough +1s anyway :-).

On Wed, 11 May 2022 at 10:17, Holden Karau <hol...@pigscanfly.ca> wrote:

> Technically, releases don't follow vetoes (see
> https://www.apache.org/foundation/voting.html ); it's up to the RM whether to
> proceed if they get the minimum number of binding +1s (although they are
> encouraged to cancel the release if any serious issues are raised).
>
> That being said, I'll add my -1 based on the issues reported in this thread.
>
> On Tue, May 10, 2022 at 6:07 PM Sean Owen <sro...@gmail.com> wrote:
>
>> There's a -1 vote here, so I think this RC fails anyway.
>>
>> On Fri, May 6, 2022 at 10:30 AM Gengliang Wang <ltn...@gmail.com> wrote:
>>
>>> Hi Maxim,
>>>
>>> Thanks for the work!
>>> There is a bug fix from Bruce that was merged into branch-3.3 right after
>>> RC1 was cut:
>>> SPARK-39093: Dividing interval by integral can result in codegen
>>> compilation error
>>> <https://github.com/apache/spark/commit/fd998c8a6783c0c8aceed8dcde4017cd479e42c8>
>>>
>>> So -1 from me. We should have an RC2 to include the fix.
>>>
>>> Thanks
>>> Gengliang
>>>
>>> On Fri, May 6, 2022 at 6:15 PM Maxim Gekk
>>> <maxim.g...@databricks.com.invalid> wrote:
>>>
>>>> Hi Dongjoon,
>>>>
>>>>  > https://issues.apache.org/jira/projects/SPARK/versions/12350369
>>>> > Since RC1 has started, could you move them out of the 3.3.0
>>>> > milestone?
>>>>
>>>> I have removed the 3.3.0 label from Fix version(s). Thank you, Dongjoon.
>>>>
>>>> Maxim Gekk
>>>>
>>>> Software Engineer
>>>>
>>>> Databricks, Inc.
>>>>
>>>>
>>>> On Fri, May 6, 2022 at 11:06 AM Dongjoon Hyun <dongjoon.h...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi, Sean.
>>>>> That's interesting. I didn't see those failures on my side.
>>>>>
>>>>> Hi, Maxim.
>>>>> At the following link, there are 17 in-progress and 6 to-do JIRA
>>>>> issues that look unrelated to this RC1 vote.
>>>>>
>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12350369
>>>>>
>>>>> Since RC1 has started, could you move them out of the 3.3.0 milestone?
>>>>> Otherwise, we cannot distinguish real new blocker issues from those
>>>>> obsolete JIRA issues.
>>>>>
>>>>> Thanks,
>>>>> Dongjoon.
>>>>>
>>>>>
>>>>> On Thu, May 5, 2022 at 11:46 AM Adam Binford <adam...@gmail.com>
>>>>> wrote:
>>>>>
>>>>>> I looked back at the first one (SPARK-37618): it expects/assumes a
>>>>>> 0022 umask to correctly test the behavior. I'm not sure how to get it
>>>>>> to not fail, or be ignored, with a more open umask.
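>>>>>>
>>>>>> If it helps, here is a quick standalone sketch (not the suite's code) for
>>>>>> checking what permissions your environment's umask actually gives a
>>>>>> newly created directory, which is what decides whether that assertion
>>>>>> trips:
>>>>>>
>>>>>>   import java.nio.file.{Files, Paths}
>>>>>>   import java.nio.file.attribute.PosixFilePermission
>>>>>>
>>>>>>   object UmaskCheck extends App {
>>>>>>     // Files.createDirectory honours the process umask (mode 0777 & ~umask),
>>>>>>     // unlike Files.createTempDirectory, which always restricts to 700.
>>>>>>     val dir = Files.createDirectory(Paths.get(
>>>>>>       System.getProperty("java.io.tmpdir"), s"umask-check-${System.nanoTime}"))
>>>>>>     val perms = Files.getPosixFilePermissions(dir)
>>>>>>     println(s"$dir -> $perms")
>>>>>>     println(s"GROUP_WRITE set: ${perms.contains(PosixFilePermission.GROUP_WRITE)}")
>>>>>>     Files.delete(dir)
>>>>>>   }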
>>>>>>
>>>>>> On Thu, May 5, 2022 at 1:56 PM Sean Owen <sro...@gmail.com> wrote:
>>>>>>
>>>>>>> I'm seeing test failures; is anyone seeing ones like this? This is
>>>>>>> Java 8 / Scala 2.12 / Ubuntu 22.04:
>>>>>>>
>>>>>>> - SPARK-37618: Sub dirs are group writable when removing from
>>>>>>> shuffle service enabled *** FAILED ***
>>>>>>>   [OWNER_WRITE, GROUP_READ, GROUP_WRITE, GROUP_EXECUTE, OTHERS_READ,
>>>>>>> OWNER_READ, OTHERS_EXECUTE, OWNER_EXECUTE] contained GROUP_WRITE
>>>>>>> (DiskBlockManagerSuite.scala:155)
>>>>>>>
>>>>>>> - Check schemas for expression examples *** FAILED ***
>>>>>>>   396 did not equal 398 Expected 396 blocks in result file but got
>>>>>>> 398. Try regenerating the result files. 
>>>>>>> (ExpressionsSchemaSuite.scala:161)
>>>>>>>
>>>>>>>  Function 'bloom_filter_agg', Expression class
>>>>>>> 'org.apache.spark.sql.catalyst.expressions.aggregate.BloomFilterAggregate'
>>>>>>> "" did not start with "
>>>>>>>       Examples:
>>>>>>>   " (ExpressionInfoSuite.scala:142)
>>>>>>>
>>>>>>> On Thu, May 5, 2022 at 6:01 AM Maxim Gekk
>>>>>>> <maxim.g...@databricks.com.invalid> wrote:
>>>>>>>
>>>>>>>> Please vote on releasing the following candidate as Apache Spark
>>>>>>>> version 3.3.0.
>>>>>>>>
>>>>>>>> The vote is open until 11:59pm Pacific time May 10th and passes if
>>>>>>>> a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>>>>>>>
>>>>>>>> [ ] +1 Release this package as Apache Spark 3.3.0
>>>>>>>> [ ] -1 Do not release this package because ...
>>>>>>>>
>>>>>>>> To learn more about Apache Spark, please see http://spark.apache.org/
>>>>>>>>
>>>>>>>> The tag to be voted on is v3.3.0-rc1 (commit
>>>>>>>> 482b7d54b522c4d1e25f3e84eabbc78126f22a3d):
>>>>>>>> https://github.com/apache/spark/tree/v3.3.0-rc1
>>>>>>>>
>>>>>>>> The release files, including signatures, digests, etc. can be
>>>>>>>> found at:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v3.3.0-rc1-bin/
>>>>>>>>
>>>>>>>> Signatures used for Spark RCs can be found in this file:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>>>>>>>
>>>>>>>> The staging repository for this release can be found at:
>>>>>>>>
>>>>>>>> https://repository.apache.org/content/repositories/orgapachespark-1402
>>>>>>>>
>>>>>>>> The documentation corresponding to this release can be found at:
>>>>>>>> https://dist.apache.org/repos/dist/dev/spark/v3.3.0-rc1-docs/
>>>>>>>>
>>>>>>>> The list of bug fixes going into 3.3.0 can be found at the
>>>>>>>> following URL:
>>>>>>>> https://issues.apache.org/jira/projects/SPARK/versions/12350369
>>>>>>>>
>>>>>>>> This release uses the release script from the tag v3.3.0-rc1.
>>>>>>>>
>>>>>>>>
>>>>>>>> FAQ
>>>>>>>>
>>>>>>>> =========================
>>>>>>>> How can I help test this release?
>>>>>>>> =========================
>>>>>>>> If you are a Spark user, you can help us test this release by taking
>>>>>>>> an existing Spark workload, running it on this release candidate, and
>>>>>>>> then reporting any regressions.
>>>>>>>>
>>>>>>>> If you're working in PySpark, you can set up a virtual env, install
>>>>>>>> the current RC, and see if anything important breaks. In Java/Scala,
>>>>>>>> you can add the staging repository to your project's resolvers and
>>>>>>>> test with the RC (make sure to clean up the artifact cache
>>>>>>>> before/after so you don't end up building with an out-of-date RC
>>>>>>>> going forward).
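>>>>>>>>
>>>>>>>> For the Java/Scala route, a minimal build.sbt sketch (the resolver name
>>>>>>>> and the spark-sql module here are just illustrative choices) could look
>>>>>>>> like:
>>>>>>>>
>>>>>>>>   // point the build at the RC1 staging repository listed above
>>>>>>>>   resolvers += ("apache-spark-3.3.0-rc1-staging" at
>>>>>>>>     "https://repository.apache.org/content/repositories/orgapachespark-1402/")
>>>>>>>>   // build your existing workload/tests against the RC version
>>>>>>>>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"
>>>>>>>>
>>>>>>>> Running your existing test suite against that build is a good way to
>>>>>>>> surface regressions.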
>>>>>>>>
>>>>>>>> ===========================================
>>>>>>>> What should happen to JIRA tickets still targeting 3.3.0?
>>>>>>>> ===========================================
>>>>>>>> The current list of open tickets targeted at 3.3.0 can be found at:
>>>>>>>> https://issues.apache.org/jira/projects/SPARK and search for
>>>>>>>> "Target Version/s" = 3.3.0
>>>>>>>>
>>>>>>>> Committers should look at those and triage. Extremely important bug
>>>>>>>> fixes, documentation, and API tweaks that impact compatibility should
>>>>>>>> be worked on immediately. Please retarget everything else to an
>>>>>>>> appropriate release.
>>>>>>>>
>>>>>>>> ==================
>>>>>>>> But my bug isn't fixed?
>>>>>>>> ==================
>>>>>>>> In order to make timely releases, we will typically not hold the
>>>>>>>> release unless the bug in question is a regression from the previous
>>>>>>>> release. That being said, if there is a regression that has not been
>>>>>>>> correctly targeted, please ping me or a committer to help target the
>>>>>>>> issue.
>>>>>>>>
>>>>>>>> Maxim Gekk
>>>>>>>>
>>>>>>>> Software Engineer
>>>>>>>>
>>>>>>>> Databricks, Inc.
>>>>>>>>
>>>>>>>
>>>>>>
>>>>>> --
>>>>>> Adam Binford
>>>>>>
>>>>>
>
> --
> Twitter: https://twitter.com/holdenkarau
> Books (Learning Spark, High Performance Spark, etc.):
> https://amzn.to/2MaRAG9
> YouTube Live Streams: https://www.youtube.com/user/holdenkarau
>
