Hi All,

The vote on Spark 3.3.1 RC1 has failed, and I will prepare RC2 soon.


On Mon, Sep 19, 2022 at 8:53 AM Dongjoon Hyun <dongjoon.h...@gmail.com>
wrote:

> I also agree with Chao on that issue.
>
> SPARK-39833 landed in 3.3.1 and 3.2.3 to avoid a correctness issue at the
> cost of a performance regression.
> Luckily, SPARK-40169 provided a correct fix and removed the main
> workaround code of SPARK-39833 before the official release.
>
> -1 for Apache Spark 3.3.1 RC1.
>
> Dongjoon.
>
>
> On Sun, Sep 18, 2022 at 10:08 AM Chao Sun <sunc...@apache.org> wrote:
>
>> It'd be really nice if we could include
>> https://issues.apache.org/jira/browse/SPARK-40169 in this release,
>> since otherwise it'll introduce a perf regression when the Parquet column
>> index is disabled.
>>
>> On Sat, Sep 17, 2022 at 2:08 PM Sean Owen <sro...@apache.org> wrote:
>> >
>> > +1 LGTM. I tested Scala 2.13 + Java 11 on Ubuntu 22.04. I get the same
>> results as usual.
>> >
>> > On Sat, Sep 17, 2022 at 2:42 AM Yuming Wang <wgy...@gmail.com> wrote:
>> >>
>> >> Please vote on releasing the following candidate as Apache Spark
>> version 3.3.1.
>> >>
>> >> The vote is open until 11:59pm Pacific time September 22nd and passes
>> if a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> >>
>> >> [ ] +1 Release this package as Apache Spark 3.3.1
>> >> [ ] -1 Do not release this package because ...
>> >>
>> >> To learn more about Apache Spark, please see https://spark.apache.org
>> >>
>> >> The tag to be voted on is v3.3.1-rc1 (commit
>> ea1a426a889626f1ee1933e3befaa975a2f0a072):
>> >> https://github.com/apache/spark/tree/v3.3.1-rc1
>> >>
>> >> The release files, including signatures, digests, etc. can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.1-rc1-bin
>> >>
>> >> Signatures used for Spark RCs can be found in this file:
>> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >>
>> >> The staging repository for this release can be found at:
>> >> https://repository.apache.org/content/repositories/orgapachespark-1418
>> >>
>> >> The documentation corresponding to this release can be found at:
>> >> https://dist.apache.org/repos/dist/dev/spark/v3.3.1-rc1-docs
>> >>
>> >> The list of bug fixes going into 3.3.1 can be found at the following
>> URL:
>> >> https://issues.apache.org/jira/projects/SPARK/versions/12351710
>> >>
>> >> This release uses the release script from the tag v3.3.1-rc1.
>> >>
>> >>
>> >> FAQ
>> >>
>> >> =========================
>> >> How can I help test this release?
>> >> =========================
>> >> If you are a Spark user, you can help us test this release by taking
>> >> an existing Spark workload and running it on this release candidate, then
>> >> reporting any regressions.
>> >>
>> >> If you're working in PySpark, you can set up a virtual env, install
>> >> the current RC, and see if anything important breaks. In Java/Scala,
>> >> you can add the staging repository to your project's resolvers and test
>> >> with the RC (make sure to clean up the artifact cache before/after so
>> >> you don't end up building with an out-of-date RC going forward).
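>> >>
>> >> As a minimal sketch of the Java/Scala path, assuming an sbt project and
>> >> picking spark-sql as an example module (your own dependencies may
>> >> differ), the staging repository from this email can be added like this:
>> >>
>> >>   // build.sbt -- resolve artifacts from the RC1 staging repository
>> >>   resolvers += "Apache Spark 3.3.1 RC1 staging" at
>> >>     "https://repository.apache.org/content/repositories/orgapachespark-1418/"
>> >>   // the staged artifacts carry the final version number, 3.3.1
>> >>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.1"
>> >>
>> >> Then run your existing test suite (e.g. sbt clean test) against the RC
>> >> artifacts and report anything that breaks.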
>> >>
>> >> ===========================================
>> >> What should happen to JIRA tickets still targeting 3.3.1?
>> >> ===========================================
>> >> The current list of open tickets targeted at 3.3.1 can be found by
>> >> going to https://issues.apache.org/jira/projects/SPARK and searching
>> for "Target Version/s" = 3.3.1
>> >>
>> >> Committers should look at those and triage. Extremely important bug
>> >> fixes, documentation, and API tweaks that impact compatibility should
>> >> be worked on immediately. Everything else please retarget to an
>> >> appropriate release.
>> >>
>> >> ==================
>> >> But my bug isn't fixed?
>> >> ==================
>> >> In order to make timely releases, we will typically not hold the
>> >> release unless the bug in question is a regression from the previous
>> >> release. That being said, if there is a regression that has not been
>> >> correctly targeted, please ping me or a committer to help target the
>> >> issue.
>> >>
>> >>
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>>
