+1

XiDuo You <ulyssesyo...@gmail.com> wrote (Tue, Jul 30, 2024 at 7:56):

> +1
>
> Zhou Jiang <zhou.c.ji...@gmail.com> wrote on Tue, Jul 30, 2024 at 02:08:
> >
> > +1 (non-binding)
> >
> > Zhou JIANG
> >
> >
> >
> > On Mon, Jul 29, 2024 at 11:06 L. C. Hsieh <vii...@gmail.com> wrote:
> >>
> >> +1
> >>
> >> On Mon, Jul 29, 2024 at 7:33 AM Wenchen Fan <cloud0...@gmail.com> wrote:
> >> >
> >> > +1
> >> >
> >> > On Sat, Jul 27, 2024 at 10:03 AM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
> >> >>
> >> >> +1
> >> >>
> >> >> Thank you, Kent.
> >> >>
> >> >> Dongjoon.
> >> >>
> >> >> On Fri, Jul 26, 2024 at 6:37 AM Kent Yao <y...@apache.org> wrote:
> >> >>>
> >> >>> Hi dev,
> >> >>>
> >> >>> Please vote on releasing the following candidate as Apache Spark version 3.5.2.
> >> >>>
> >> >>> The vote is open until Jul 29, 14:00:00 UTC, and passes if a majority
> >> >>> of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> >> >>>
> >> >>> [ ] +1 Release this package as Apache Spark 3.5.2
> >> >>> [ ] -1 Do not release this package because ...
> >> >>>
> >> >>> To learn more about Apache Spark, please see https://spark.apache.org/
> >> >>>
> >> >>> The tag to be voted on is v3.5.2-rc4 (commit
> >> >>> 1edbddfadeb46581134fa477d35399ddc63b7163):
> >> >>> https://github.com/apache/spark/tree/v3.5.2-rc4
> >> >>>
> >> >>> The release files, including signatures, digests, etc. can be found at:
> >> >>> https://dist.apache.org/repos/dist/dev/spark/v3.5.2-rc4-bin/
> >> >>>
> >> >>> Signatures used for Spark RCs can be found in this file:
> >> >>> https://dist.apache.org/repos/dist/dev/spark/KEYS
> >> >>>
> >> >>> The staging repository for this release can be found at:
> >> >>> https://repository.apache.org/content/repositories/orgapachespark-1460/
> >> >>>
> >> >>> The documentation corresponding to this release can be found at:
> >> >>> https://dist.apache.org/repos/dist/dev/spark/v3.5.2-rc4-docs/
> >> >>>
> >> >>> The list of bug fixes going into 3.5.2 can be found at the following URL:
> >> >>> https://issues.apache.org/jira/projects/SPARK/versions/12353980
> >> >>>
> >> >>> FAQ
> >> >>>
> >> >>> =========================
> >> >>> How can I help test this release?
> >> >>> =========================
> >> >>>
> >> >>> If you are a Spark user, you can help us test this release by taking
> >> >>> an existing Spark workload and running it on this release candidate,
> >> >>> then reporting any regressions.
> >> >>>
> >> >>> If you're working in PySpark, you can set up a virtual env and install
> >> >>> the current RC via "pip install
> >> >>> https://dist.apache.org/repos/dist/dev/spark/v3.5.2-rc4-bin/pyspark-3.5.2.tar.gz"
> >> >>> and see if anything important breaks.
> >> >>> In Java/Scala, you can add the staging repository to your project's
> >> >>> resolvers and test with the RC (make sure to clean up the artifact
> >> >>> cache before/after so you don't end up building with an out-of-date
> >> >>> RC going forward).
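> >> >>> For instance, a minimal build.sbt sketch for resolving the staged
> >> >>> artifacts (assuming sbt, with spark-sql standing in for whatever your
> >> >>> project actually depends on):
> >> >>>
> >> >>>   // Spark 3.5.x artifacts are published for Scala 2.12 and 2.13.
> >> >>>   scalaVersion := "2.12.18"
> >> >>>   // Point sbt at the RC staging repository.
> >> >>>   resolvers += ("Apache Spark 3.5.2 RC4 staging" at
> >> >>>     "https://repository.apache.org/content/repositories/orgapachespark-1460/")
> >> >>>   // Pull the staged 3.5.2 artifacts instead of a released version.
> >> >>>   libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.2"
> >> >>>
> >> >>> Then run your existing tests against it, and afterwards clear the cached
> >> >>> org.apache.spark artifacts (e.g. under ~/.ivy2 and ~/.m2) as noted above.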
> >> >>>
> >> >>> ===========================================
> >> >>> What should happen to JIRA tickets still targeting 3.5.2?
> >> >>> ===========================================
> >> >>>
> >> >>> The current list of open tickets targeted at 3.5.2 can be found at:
> >> >>> https://issues.apache.org/jira/projects/SPARK and search for
> >> >>> "Target Version/s" = 3.5.2
> >> >>>
> >> >>> Committers should look at those and triage. Extremely important bug
> >> >>> fixes, documentation, and API tweaks that impact compatibility should
> >> >>> be worked on immediately. Everything else, please retarget to an
> >> >>> appropriate release.
> >> >>>
> >> >>> ==================
> >> >>> But my bug isn't fixed?
> >> >>> ==================
> >> >>>
> >> >>> In order to make timely releases, we will typically not hold the
> >> >>> release unless the bug in question is a regression from the previous
> >> >>> release. That being said, if there is a regression that has not been
> >> >>> correctly targeted, please ping me or a committer to help target the
> >> >>> issue.
> >> >>>
> >> >>> Thanks,
> >> >>> Kent Yao
> >> >>>
