Thanks, all. I am cutting v3.4.0-rc6 now that all blockers have been
resolved.

On Thu, Apr 6, 2023 at 1:07 AM yangjie01 <yangji...@baidu.com> wrote:

> I suggest including the fix for SPARK-39696 in the next RC; it is an issue
> that is triggered with Scala versions after 2.13.7.
>
>
>
> 1. https://issues.apache.org/jira/browse/SPARK-39696
>
> 2. https://github.com/apache/spark/pull/40663
>
>
>
>
>
> Yang Jie
>
>
>
>
>
> *From:* John Zhuge <jzh...@apache.org>
> *Date:* Thursday, April 6, 2023, 13:59
> *To:* Anton Okolnychyi <aokolnyc...@apache.org>
> *Cc:* "dev@spark.apache.org" <dev@spark.apache.org>
> *Subject:* Re: [VOTE] Release Apache Spark 3.4.0 (RC5)
>
>
>
> Thanks all for taking care of this!
>
>
>
> On Wed, Apr 5, 2023 at 12:49 PM Anton Okolnychyi <aokolnyc...@apache.org>
> wrote:
>
> Thanks all!
>
> Let me check all exceptions and submit a PR. Will do it now if nobody
> created a PR yet.
>
> - Anton
>
> On 2023/04/05 19:09:13 Xiao Li wrote:
> > Hi, Anton,
> >
> > Could you please provide a complete list of exceptions that are being used
> > in the public connector API?
> >
> > Thanks,
> >
> > Xiao
> >
> > Xinrong Meng <xinrong.apa...@gmail.com> wrote on Wednesday, April 5, 2023 at 12:06:
> >
> > > Thank you!
> > >
> > > I created a blocker Jira for that for easier tracking:
> > > https://issues.apache.org/jira/browse/SPARK-43041.
> > >
> > >
> > > On Wed, Apr 5, 2023 at 11:20 AM Gengliang Wang <ltn...@gmail.com>
> wrote:
> > >
> > >> Hi Anton,
> > >>
> > >> +1 for adding the old constructors back!
> > >> Could you raise a PR for this? I will review it ASAP.
> > >>
> > >> Thanks
> > >> Gengliang
> > >>
> > >> On Wed, Apr 5, 2023 at 9:37 AM Anton Okolnychyi <
> aokolnyc...@apache.org>
> > >> wrote:
> > >>
> > >>> Sorry, I think my last message did not land on the list.
> > >>>
> > >>> I have a question about changes to exceptions used in the public
> > >>> connector API, such as NoSuchTableException and TableAlreadyExistsException.
> > >>>
> > >>> I consider those part of the public Catalog API (TableCatalog uses them
> > >>> in method definitions). However, it looks like PR #37887 has changed
> > >>> them in an incompatible way. Old constructors accepting Identifier objects
> > >>> were removed; the only way to construct such exceptions now is by passing
> > >>> database and table strings or a Scala Seq. Shall we add the old constructors
> > >>> back to avoid breaking connectors?
> > >>>
> > >>> [1] - https://github.com/apache/spark/pull/37887/
> > >>> [2] - https://issues.apache.org/jira/browse/SPARK-40360
> > >>> [3] - https://github.com/apache/spark/blob/master/sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/NoSuchItemException.scala
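> > >>>
> > >>> To make the breakage concrete, here is a rough sketch of the kind of
> > >>> connector code that stops compiling. The exact pre-change constructor
> > >>> signatures are assumed from memory, so treat this as illustrative only:
> > >>>
> > >>>   // Minimal sketch of connector-side code relying on the Identifier-based
> > >>>   // constructor; the signature NoSuchTableException(ident: Identifier) is
> > >>>   // an assumption based on the pre-#37887 API, not verified against master.
> > >>>   import org.apache.spark.sql.catalyst.analysis.NoSuchTableException
> > >>>   import org.apache.spark.sql.connector.catalog.{Identifier, Table}
> > >>>
> > >>>   object ConnectorLookup {
> > >>>     // Throws the public-API exception when the table is missing; this line
> > >>>     // no longer compiles once the Identifier-accepting constructor is gone.
> > >>>     def loadOrFail(ident: Identifier, lookup: Identifier => Option[Table]): Table =
> > >>>       lookup(ident).getOrElse(throw new NoSuchTableException(ident))
> > >>>   }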
> > >>>
> > >>> - Anton
> > >>>
> > >>> On 2023/04/05 16:23:52 Xinrong Meng wrote:
> > >>> > Considering the above blockers have been resolved, I am about to
> > >>> > cut v3.4.0-rc6 if there are no objections.
> > >>> >
> > >>> > On Tue, Apr 4, 2023 at 8:20 AM Xinrong Meng <
> xinrong.apa...@gmail.com>
> > >>> > wrote:
> > >>> >
> > >>> > > Thank you Wenchen for the report. I marked them as blockers just
> now.
> > >>> > >
> > >>> > > On Tue, Apr 4, 2023 at 10:52 AM Wenchen Fan <cloud0...@gmail.com
> >
> > >>> wrote:
> > >>> > >
> > >>> > >> Sorry for the last-minute change, but we found two incorrect behaviors
> > >>> > >> and want to fix them before the release:
> > >>> > >>
> > >>> > >> https://github.com/apache/spark/pull/40641
> > >>> > >> We missed a corner case when the input index for `array_insert` is 0.
> > >>> > >> It should fail, as 0 is an invalid index.
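> > >>> > >>
> > >>> > >> For illustration (my own sketch, not taken from the PR), the corner case
> > >>> > >> can be seen from spark-shell, where the `spark` session is predefined.
> > >>> > >> Positions are 1-based (or negative, counting from the end), so 0 should
> > >>> > >> now raise an error:
> > >>> > >>
> > >>> > >>   // Invalid: 0 is neither a 1-based position nor a negative index,
> > >>> > >>   // so after the fix this is expected to fail instead of succeeding.
> > >>> > >>   spark.sql("SELECT array_insert(array(1, 2, 3), 0, 99)").show()
> > >>> > >>
> > >>> > >>   // Valid for comparison: insert 99 at the first position.
> > >>> > >>   spark.sql("SELECT array_insert(array(1, 2, 3), 1, 99)").show()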
> > >>> > >>
> > >>> > >> https://github.com/apache/spark/pull/40623
> > >>> > >> We found some usability issues with a new API and need to change the
> > >>> > >> API to fix them. If people have concerns, we can also remove the new
> > >>> > >> API entirely.
> > >>> > >>
> > >>> > >> Thus I'm -1 on this RC. I'll merge these two PRs today if there are
> > >>> > >> no objections.
> > >>> > >>
> > >>> > >> Thanks,
> > >>> > >> Wenchen
> > >>> > >>
> > >>> > >> On Tue, Apr 4, 2023 at 3:47 AM L. C. Hsieh <vii...@gmail.com>
> > >>> wrote:
> > >>> > >>
> > >>> > >>> +1
> > >>> > >>>
> > >>> > >>> Thanks Xinrong.
> > >>> > >>>
> > >>> > >>> On Mon, Apr 3, 2023 at 12:35 PM Dongjoon Hyun <
> > >>> dongjoon.h...@gmail.com>
> > >>> > >>> wrote:
> > >>> > >>> >
> > >>> > >>> > +1
> > >>> > >>> >
> > >>> > >>> > I also verified that RC5 has SBOM artifacts.
> > >>> > >>> >
> > >>> > >>> > https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.12/3.4.0/spark-core_2.12-3.4.0-cyclonedx.json
> > >>> > >>> > https://repository.apache.org/content/repositories/orgapachespark-1439/org/apache/spark/spark-core_2.13/3.4.0/spark-core_2.13-3.4.0-cyclonedx.json
> > >>> > >>> >
> > >>> > >>> > Thanks,
> > >>> > >>> > Dongjoon.
> > >>> > >>> >
> > >>> > >>> >
> > >>> > >>> >
> > >>> > >>> > On Mon, Apr 3, 2023 at 1:57 AM yangjie01 <
> yangji...@baidu.com>
> > >>> wrote:
> > >>> > >>> >>
> > >>> > >>> >> +1, checked Java 17 + Scala 2.13 + Python 3.10.10.
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> From: Herman van Hovell <her...@databricks.com.INVALID>
> > >>> > >>> >> Date: Friday, March 31, 2023, 12:12
> > >>> > >>> >> To: Sean Owen <sro...@apache.org>
> > >>> > >>> >> Cc: Xinrong Meng <xinrong.apa...@gmail.com>, dev <dev@spark.apache.org>
> > >>> > >>> >> Subject: Re: [VOTE] Release Apache Spark 3.4.0 (RC5)
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> +1
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> On Thu, Mar 30, 2023 at 11:05 PM Sean Owen <
> sro...@apache.org>
> > >>> wrote:
> > >>> > >>> >>
> > >>> > >>> >> +1 same result from me as last time.
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> On Thu, Mar 30, 2023 at 3:21 AM Xinrong Meng <
> > >>> > >>> xinrong.apa...@gmail.com> wrote:
> > >>> > >>> >>
> > >>> > >>> >> Please vote on releasing the following candidate (RC5) as Apache Spark
> > >>> > >>> >> version 3.4.0.
> > >>> > >>> >>
> > >>> > >>> >> The vote is open until 11:59 pm Pacific time on April 4th and passes if a
> > >>> > >>> >> majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
> > >>> > >>> >>
> > >>> > >>> >> [ ] +1 Release this package as Apache Spark 3.4.0
> > >>> > >>> >> [ ] -1 Do not release this package because ...
> > >>> > >>> >>
> > >>> > >>> >> To learn more about Apache Spark, please see http://spark.apache.org/
> > >>> > >>> >>
> > >>> > >>> >> The tag to be voted on is v3.4.0-rc5 (commit f39ad617d32a671e120464e4a75986241d72c487):
> > >>> > >>> >> https://github.com/apache/spark/tree/v3.4.0-rc5
> > >>> > >>> >>
> > >>> > >>> >> The release files, including signatures, digests, etc. can be found at:
> > >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-bin/
> > >>> > >>> >>
> > >>> > >>> >> Signatures used for Spark RCs can be found in this file:
> > >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/KEYS
> > >>> > >>> >>
> > >>> > >>> >> The staging repository for this release can be found at:
> > >>> > >>> >> https://repository.apache.org/content/repositories/orgapachespark-1439
> > >>> > >>> >>
> > >>> > >>> >> The documentation corresponding to this release can be found at:
> > >>> > >>> >> https://dist.apache.org/repos/dist/dev/spark/v3.4.0-rc5-docs/
> > >>> > >>> >>
> > >>> > >>> >> The list of bug fixes going into 3.4.0 can be found at the following URL:
> > >>> > >>> >> https://issues.apache.org/jira/projects/SPARK/versions/12351465
> > >>> > >>> >>
> > >>> > >>> >> This release uses the release script of the tag v3.4.0-rc5.
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> FAQ
> > >>> > >>> >>
> > >>> > >>> >> =========================
> > >>> > >>> >> How can I help test this release?
> > >>> > >>> >> =========================
> > >>> > >>> >> If you are a Spark user, you can help us test this release by taking
> > >>> > >>> >> an existing Spark workload, running it on this release candidate, and
> > >>> > >>> >> then reporting any regressions.
> > >>> > >>> >>
> > >>> > >>> >> If you're working in PySpark, you can set up a virtual env, install
> > >>> > >>> >> the current RC, and see if anything important breaks. In Java/Scala,
> > >>> > >>> >> you can add the staging repository to your project's resolvers and
> > >>> > >>> >> test with the RC (make sure to clean up the artifact cache before and
> > >>> > >>> >> after so you don't end up building with an out-of-date RC going forward).
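> > >>> > >>> >>
> > >>> > >>> >> For example, a minimal sbt sketch for pointing a build at the staging
> > >>> > >>> >> repository (the resolver name is arbitrary; the URL is the staging
> > >>> > >>> >> repository listed above, where the artifacts appear to be published as
> > >>> > >>> >> version 3.4.0):
> > >>> > >>> >>
> > >>> > >>> >>   // build.sbt -- test a project against the RC from the staging repository.
> > >>> > >>> >>   resolvers += "apache-spark-3.4.0-rc5-staging" at
> > >>> > >>> >>     "https://repository.apache.org/content/repositories/orgapachespark-1439"
> > >>> > >>> >>
> > >>> > >>> >>   // Depend on the RC artifacts; remember to clear the local cache afterwards.
> > >>> > >>> >>   libraryDependencies ++= Seq(
> > >>> > >>> >>     "org.apache.spark" %% "spark-core" % "3.4.0" % Provided,
> > >>> > >>> >>     "org.apache.spark" %% "spark-sql"  % "3.4.0" % Provided
> > >>> > >>> >>   )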
> > >>> > >>> >>
> > >>> > >>> >> ===========================================
> > >>> > >>> >> What should happen to JIRA tickets still targeting 3.4.0?
> > >>> > >>> >> ===========================================
> > >>> > >>> >> The current list of open tickets targeted at 3.4.0 can be found at
> > >>> > >>> >> https://issues.apache.org/jira/projects/SPARK by searching for
> > >>> > >>> >> "Target Version/s" = 3.4.0
> > >>> > >>> >>
> > >>> > >>> >> Committers should look at those and triage. Extremely important bug
> > >>> > >>> >> fixes, documentation, and API tweaks that impact compatibility should
> > >>> > >>> >> be worked on immediately. Everything else, please retarget to an
> > >>> > >>> >> appropriate release.
> > >>> > >>> >>
> > >>> > >>> >> ==================
> > >>> > >>> >> But my bug isn't fixed?
> > >>> > >>> >> ==================
> > >>> > >>> >> In order to make timely releases, we will typically not hold the
> > >>> > >>> >> release unless the bug in question is a regression from the previous
> > >>> > >>> >> release. That being said, if there is something that is a regression
> > >>> > >>> >> that has not been correctly targeted, please ping me or a committer
> > >>> > >>> >> to help target the issue.
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>> >> Thanks,
> > >>> > >>> >>
> > >>> > >>> >> Xinrong Meng
> > >>> > >>> >>
> > >>> > >>> >>
> > >>> > >>>
> > >>> > >>>
> > >>> > >>>
> > >>> > >>>
> > >>> >
> > >>>
> > >>>
> > >>>
> >
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>
>
>
> --
>
> John Zhuge
>
