Thank you everyone for your participation.

The vote is open until June 23rd 1AM (PST) and I'll conclude this vote
after that.

Dongjoon.



On Thu, Jun 22, 2023 at 8:29 AM Chao Sun <sunc...@apache.org> wrote:

> +1
>
> On Thu, Jun 22, 2023 at 6:52 AM Yuming Wang <yumw...@apache.org> wrote:
> >
> > +1.
> >
> > On Thu, Jun 22, 2023 at 4:41 PM Jacek Laskowski <ja...@japila.pl> wrote:
> >>
> >> +1
> >>
> >> Builds and runs fine on Java 17, macOS.
> >>
> >> $ ./dev/change-scala-version.sh 2.13
> >> $ mvn \
> >>     -Pkubernetes,hadoop-cloud,hive,hive-thriftserver,scala-2.13,volcano,connect \
> >>     -DskipTests \
> >>     clean install
> >>
> >> $ python/run-tests --parallelism=1 --testnames 'pyspark.sql.session SparkSession.sql'
> >> ...
> >> Tests passed in 28 seconds
> >>
> >> Best regards,
> >> Jacek Laskowski
> >> ----
> >> "The Internals Of" Online Books
> >> Follow me on https://twitter.com/jaceklaskowski
> >>
> >>
> >>
> >> On Tue, Jun 20, 2023 at 4:41 AM Dongjoon Hyun <dongj...@apache.org> wrote:
> >>>
> >>> Please vote on releasing the following candidate as Apache Spark
> >>> version 3.4.1.
> >>>
> >>> The vote is open until June 23rd 1AM (PST) and passes if a majority
> >>> of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >>>
> >>> [ ] +1 Release this package as Apache Spark 3.4.1
> >>> [ ] -1 Do not release this package because ...
> >>>
> >>> To learn more about Apache Spark, please see https://spark.apache.org/
> >>>
> >>> The tag to be voted on is v3.4.1-rc1 (commit
> >>> 6b1ff22dde1ead51cbf370be6e48a802daae58b6)
> >>> https://github.com/apache/spark/tree/v3.4.1-rc1
> >>>
> >>> The release files, including signatures, digests, etc. can be found at:
> >>> https://dist.apache.org/repos/dist/dev/spark/v3.4.1-rc1-bin/
> >>>
> >>> Signatures used for Spark RCs can be found in this file:
> >>> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >>>
> >>> The staging repository for this release can be found at:
> >>> https://repository.apache.org/content/repositories/orgapachespark-1443/
> >>>
> >>> The documentation corresponding to this release can be found at:
> >>> https://dist.apache.org/repos/dist/dev/spark/v3.4.1-rc1-docs/
> >>>
> >>> The list of bug fixes going into 3.4.1 can be found at the following URL:
> >>> https://issues.apache.org/jira/projects/SPARK/versions/12352874
> >>>
> >>> This release uses the release script from the tag v3.4.1-rc1.
> >>>
> >>> FAQ
> >>>
> >>> =========================
> >>> How can I help test this release?
> >>> =========================
> >>>
> >>> If you are a Spark user, you can help us test this release by taking
> >>> an existing Spark workload and running it on this release candidate,
> >>> then reporting any regressions.
> >>>
> >>> If you're working in PySpark, you can set up a virtual env, install
> >>> the current RC, and see if anything important breaks; a sketch of this
> >>> check follows below. If you're working in Java/Scala, you can add the
> >>> staging repository to your project's resolvers and test with the RC
> >>> (make sure to clean up the artifact cache before/after so you don't
> >>> end up building with an out-of-date RC going forward).
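> >>>
> >>> For example, a minimal shell sketch of that PySpark check (the
> >>> environment path and the exact tarball filename under the -bin/
> >>> directory below are assumptions; use whatever the staging directory
> >>> actually contains):
> >>>
> >>> # Create and activate an isolated environment for the RC (path is hypothetical).
> >>> $ python -m venv /tmp/spark-3.4.1-rc1-venv
> >>> $ source /tmp/spark-3.4.1-rc1-venv/bin/activate
> >>> # Install the PySpark sdist straight from the RC bin directory (filename assumed).
> >>> $ pip install https://dist.apache.org/repos/dist/dev/spark/v3.4.1-rc1-bin/pyspark-3.4.1.tar.gz
> >>> # Smoke test: start a SparkSession and run a trivial SQL query.
> >>> $ python -c "from pyspark.sql import SparkSession; print(SparkSession.builder.getOrCreate().sql('SELECT 1').collect())"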
> >>>
> >>> ===========================================
> >>> What should happen to JIRA tickets still targeting 3.4.1?
> >>> ===========================================
> >>>
> >>> The current list of open tickets targeted at 3.4.1 can be found at:
> >>> https://issues.apache.org/jira/projects/SPARK and search for
> >>> "Target Version/s" = 3.4.1
> >>>
> >>> Committers should look at those and triage. Extremely important bug
> >>> fixes, documentation, and API tweaks that impact compatibility should
> >>> be worked on immediately. Everything else should be retargeted to an
> >>> appropriate release.
> >>>
> >>> ==================
> >>> But my bug isn't fixed?
> >>> ==================
> >>>
> >>> In order to make timely releases, we will typically not hold the
> >>> release unless the bug in question is a regression from the previous
> >>> release. That being said, if there is a regression that has not been
> >>> correctly targeted, please ping me or a committer to help target the
> >>> issue.
>
