+1 Thanks Xinrong.
On Mon, Apr 3, 2023 at 12:35 PM Dongjoon Hyun <dongjoon.h...@gmail.com> wrote:
>
> +1
>
> I also verified that RC5 has SBOM artifacts.
>
> https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.12/3.4.0/spark-core_2.12-3.4.0-cyclonedx.json
> https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.13/3.4.0/spark-core_2.13-3.4.0-cyclonedx.json
>
> Thanks,
> Dongjoon.
>
>
> On Mon, Apr 3, 2023 at 1:57 AM yangjie01 <yangji...@baidu.com> wrote:
>>
>> +1, checked Java 17 + Scala 2.13 + Python 3.10.10.
>>
>> From: Herman van Hovell <her...@databricks.com.INVALID>
>> Date: Friday, March 31, 2023, 12:12
>> To: Sean Owen <sro...@apache.org>
>> Cc: Xinrong Meng <xinrong.apa...@gmail.com>, dev <dev@spark.apache.org>
>> Subject: Re: [VOTE] Release Apache Spark 3.4.0 (RC5)
>>
>> +1
>>
>> On Thu, Mar 30, 2023 at 11:05 PM Sean Owen <sro...@apache.org> wrote:
>>
>> +1, same result from me as last time.
>>
>> On Thu, Mar 30, 2023 at 3:21 AM Xinrong Meng <xinrong.apa...@gmail.com> wrote:
>>
>> Please vote on releasing the following candidate (RC5) as Apache Spark
>> version 3.4.0.
>>
>> The vote is open until 11:59 pm Pacific time, April 4th, and passes if a
>> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>>
>> [ ] +1 Release this package as Apache Spark 3.4.0
>> [ ] -1 Do not release this package because ...
>>
>> To learn more about Apache Spark, please see http://spark.apache.org/
>>
>> The tag to be voted on is v3.4.0-rc5 (commit
>> f39ad617d32a671e120464e4a75986241d72c487):
>> https://github.com/apache/spark/tree/v3.4.0-rc5
>>
>> The release files, including signatures, digests, etc.,
>> can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-bin/
>>
>> Signatures used for Spark RCs can be found in this file:
>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>>
>> The staging repository for this release can be found at:
>> https://repository.apache.org/content/repositories/orgapachespark-1439
>>
>> The documentation corresponding to this release can be found at:
>> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-docs/
>>
>> The list of bug fixes going into 3.4.0 can be found at the following URL:
>> https://issues.apache.org/jira/projects/SPARK/versions/12351465
>>
>> This release uses the release script of the tag v3.4.0-rc5.
>>
>> FAQ
>>
>> =========================
>> How can I help test this release?
>> =========================
>> If you are a Spark user, you can help us test this release by taking
>> an existing Spark workload and running it on this release candidate, then
>> reporting any regressions.
>>
>> If you're working in PySpark, you can set up a virtual env, install
>> the current RC, and see if anything important breaks. In Java/Scala,
>> you can add the staging repository to your project's resolvers and test
>> with the RC (make sure to clean up the artifact cache before/after so
>> you don't end up building with an out-of-date RC going forward).
>>
>> ===========================================
>> What should happen to JIRA tickets still targeting 3.4.0?
>> ===========================================
>> The current list of open tickets targeted at 3.4.0 can be found at:
>> https://issues.apache.org/jira/projects/SPARK and search for "Target
>> Version/s" = 3.4.0
>>
>> Committers should look at those and triage. Extremely important bug
>> fixes, documentation, and API tweaks that impact compatibility should
>> be worked on immediately. Everything else, please retarget to an
>> appropriate release.
>>
>> ==================
>> But my bug isn't fixed?
>> ==================
>> In order to make timely releases, we will typically not hold the
>> release unless the bug in question is a regression from the previous
>> release. That being said, if there is something that is a regression
>> and has not been correctly targeted, please ping me or a committer to
>> help target the issue.
>>
>> Thanks,
>>
>> Xinrong Meng
>>
>> ---------------------------------------------------------------------
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
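For the Java/Scala testing path described in the FAQ, adding the staging repository to an sbt build might look like the sketch below. The resolver URL is the staging repository from this thread; the Scala version and the `spark-sql` dependency are just example choices, not prescribed by the release email.

```scala
// build.sbt -- sketch for compiling/testing against the 3.4.0 RC5 staging artifacts.
// Resolver URL is the orgapachespark-1439 staging repo from this vote thread;
// the module and Scala version below are illustrative placeholders.
ThisBuild / scalaVersion := "2.12.17"

resolvers += "Apache Spark 3.4.0 RC5 staging" at
  "https://repository.apache.org/content/repositories/orgapachespark-1439/"

libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.4.0"
```

After testing, remember to clear the RC artifacts from your local caches (e.g. the `org.apache.spark` entries under `~/.ivy2/cache` and the Coursier cache) so later builds don't silently keep using the out-of-date RC once the final release is published.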