Hi Kent and Wenchen,

Thanks for reporting. I just created
https://github.com/apache/spark/pull/36609 to fix the issue.

Gengliang

On Thu, May 19, 2022 at 5:40 PM Wenchen Fan <cloud0...@gmail.com> wrote:

> I think it should have been fixed by
> https://github.com/apache/spark/commit/0fdb6757946e2a0991256a3b73c0c09d6e764eed
> . Maybe the fix is not complete...
>
> On Thu, May 19, 2022 at 2:16 PM Kent Yao <y...@apache.org> wrote:
>
>> Thanks, Maxim.
>>
>> Leave my -1 for this release candidate.
>>
>> Unfortunately, I don't know which PR fixed this.
>> Does anyone happen to know?
>>
>> BR,
>> Kent Yao
>>
>> Maxim Gekk <maxim.g...@databricks.com> wrote on Thu, May 19, 2022 at 13:42:
>> >
>> > Hi Kent,
>> >
>> > > Shall we backport the fix from the master to 3.3 too?
>> >
>> > Yes, we shall.
>> >
>> > Maxim Gekk
>> >
>> > Software Engineer
>> >
>> > Databricks, Inc.
>> >
>> >
>> >
>> > On Thu, May 19, 2022 at 6:44 AM Kent Yao <y...@apache.org> wrote:
>> >>
>> >> Hi,
>> >>
>> >> I verified the simple case below with the binary release, and it looks
>> >> like a bug to me.
>> >>
>> >> bin/spark-sql -e "select date '2018-11-17' > 1"
>> >>
>> >> Error in query: Invalid call to toAttribute on unresolved object;
>> >> 'Project [unresolvedalias((2018-11-17 > 1), None)]
>> >> +- OneRowRelation
>> >>
>> >> Both the 3.2 releases and the master branch work fine, reporting the
>> >> correct error - 'due to data type mismatch'.
>> >>
>> >> Shall we backport the fix from the master to 3.3 too?
>> >>
>> >> Bests
>> >>
>> >> Kent Yao
>> >>
>> >>
>> >> Yuming Wang <wgy...@gmail.com> wrote on Wed, May 18, 2022 at 19:04:
>> >> >
>> >> > -1. There is a regression:
>> https://github.com/apache/spark/pull/36595
>> >> >
>> >> > On Wed, May 18, 2022 at 4:11 PM Martin Grigorov <
>> mgrigo...@apache.org> wrote:
>> >> >>
>> >> >> Hi,
>> >> >>
>> >> >> [X] +1 Release this package as Apache Spark 3.3.0
>> >> >>
>> >> >> Tested:
>> >> >> - make local distribution from sources (with
>> ./dev/make-distribution.sh --tgz --name with-volcano
>> -Pkubernetes,volcano,hadoop-3)
>> >> >> - create a Docker image (with JDK 11)
>> >> >> - run Pi example on
>> >> >> -- local
>> >> >> -- Kubernetes with default scheduler
>> >> >> -- Kubernetes with Volcano scheduler
>> >> >>
>> >> >> On both x86_64 and aarch64 !
>> >> >>
>> >> >> Regards,
>> >> >> Martin
>> >> >>
>> >> >>
>> >> >> On Mon, May 16, 2022 at 3:44 PM Maxim Gekk <
>> maxim.g...@databricks.com.invalid> wrote:
>> >> >>>
>> >> >>> Please vote on releasing the following candidate as Apache Spark
>> version 3.3.0.
>> >> >>>
>> >> >>> The vote is open until 11:59pm Pacific time May 19th and passes if
>> >> >>> a majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
>> >> >>>
>> >> >>> [ ] +1 Release this package as Apache Spark 3.3.0
>> >> >>> [ ] -1 Do not release this package because ...
>> >> >>>
>> >> >>> To learn more about Apache Spark, please see
>> http://spark.apache.org/
>> >> >>>
>> >> >>> The tag to be voted on is v3.3.0-rc2 (commit
>> c8c657b922ac8fd8dcf9553113e11a80079db059):
>> >> >>> https://github.com/apache/spark/tree/v3.3.0-rc2
>> >> >>>
>> >> >>> The release files, including signatures, digests, etc. can be
>> found at:
>> >> >>> https://dist.apache.org/repos/dist/dev/spark/v3.3.0-rc2-bin/
>> >> >>>
>> >> >>> Signatures used for Spark RCs can be found in this file:
>> >> >>> https://dist.apache.org/repos/dist/dev/spark/KEYS
>> >> >>>
>> >> >>> The staging repository for this release can be found at:
>> >> >>>
>> https://repository.apache.org/content/repositories/orgapachespark-1403
>> >> >>>
>> >> >>> The documentation corresponding to this release can be found at:
>> >> >>> https://dist.apache.org/repos/dist/dev/spark/v3.3.0-rc2-docs/
>> >> >>>
>> >> >>> The list of bug fixes going into 3.3.0 can be found at the
>> following URL:
>> >> >>> https://issues.apache.org/jira/projects/SPARK/versions/12350369
>> >> >>>
>> >> >>> This release is using the release script of the tag v3.3.0-rc2.
>> >> >>>
>> >> >>>
>> >> >>> FAQ
>> >> >>>
>> >> >>> =========================
>> >> >>> How can I help test this release?
>> >> >>> =========================
>> >> >>> If you are a Spark user, you can help us test this release by
>> >> >>> taking an existing Spark workload, running it on this release
>> >> >>> candidate, and reporting any regressions.
>> >> >>>
>> >> >>> If you're working in PySpark, you can set up a virtual env, install
>> >> >>> the current RC, and see if anything important breaks; in Java/Scala,
>> >> >>> you can add the staging repository to your project's resolvers and
>> >> >>> test with the RC (make sure to clean up the artifact cache
>> >> >>> before/after so you don't end up building with an out-of-date RC
>> >> >>> going forward).
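>> >> >>> As a sketch, adding the staging repository in an sbt build might
>> >> >>> look like the following (the URL is the orgapachespark-1403 staging
>> >> >>> repository above; the staged artifact version is assumed to be 3.3.0):
>> >> >>>
>> >> >>> ```scala
>> >> >>> // Hypothetical build.sbt fragment for testing the RC artifacts.
>> >> >>> resolvers += "Apache Spark RC staging" at
>> >> >>>   "https://repository.apache.org/content/repositories/orgapachespark-1403/"
>> >> >>> libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.3.0"
>> >> >>> ```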
>> >> >>>
>> >> >>> ===========================================
>> >> >>> What should happen to JIRA tickets still targeting 3.3.0?
>> >> >>> ===========================================
>> >> >>> The current list of open tickets targeted at 3.3.0 can be found at:
>> >> >>> https://issues.apache.org/jira/projects/SPARK and search for
>> "Target Version/s" = 3.3.0
>> >> >>>
>> >> >>> Committers should look at those and triage. Extremely important bug
>> >> >>> fixes, documentation, and API tweaks that impact compatibility
>> should
>> >> >>> be worked on immediately. Everything else please retarget to an
>> >> >>> appropriate release.
>> >> >>>
>> >> >>> ==================
>> >> >>> But my bug isn't fixed?
>> >> >>> ==================
>> >> >>> In order to make timely releases, we will typically not hold the
>> >> >>> release unless the bug in question is a regression from the
>> previous
>> >> >>> release. That being said, if there is something which is a
>> regression
>> >> >>> that has not been correctly targeted please ping me or a committer
>> to
>> >> >>> help target the issue.
>> >> >>>
>> >> >>> Maxim Gekk
>> >> >>>
>> >> >>> Software Engineer
>> >> >>>
>> >> >>> Databricks, Inc.
>> >>
>> >> ---------------------------------------------------------------------
>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>
>>
>>
