Thank you, Maxim!
Dongjoon.
On Thu, May 19, 2022 at 11:49 PM Maxim Gekk wrote:
Hi All,
The voting for Spark 3.3.0 RC2 has failed since there aren't enough +1 and
due to reported bugs. I will prepare RC3 at the beginning of next week.
All known issues have been resolved in 3.3 already (at least the issues
reported in the thread). Please test the current branch-3.3 and
Hi,
When testing out Spark 3.3.0 on our production Spark workload, we noticed
that https://issues.apache.org/jira/browse/SPARK-38681 is actually a
regression from 3.2 (I did not know this at the time of creating the
ticket). It seems the bug was introduced in
Thanks for the quick fix, Gengliang.
BR,
Kent
On Thu, May 19, 2022 at 18:25, Gengliang Wang wrote:
Hi Kent and Wenchen,
Thanks for reporting. I just created
https://github.com/apache/spark/pull/36609 to fix the issue.
Gengliang
On Thu, May 19, 2022 at 5:40 PM Wenchen Fan wrote:
I think it should have been fixed by
https://github.com/apache/spark/commit/0fdb6757946e2a0991256a3b73c0c09d6e764eed.
Maybe the fix is not complete...
On Thu, May 19, 2022 at 2:16 PM Kent Yao wrote:
Thanks, Maxim.
Leave my -1 for this release candidate.
Unfortunately, I don't know which PR fixed this.
Does anyone happen to know?
BR,
Kent Yao
On Thu, May 19, 2022 at 13:42, Maxim Gekk wrote:
Hi Kent,
> Shall we backport the fix from the master to 3.3 too?
Yes, we shall.
Maxim Gekk
Software Engineer
Databricks, Inc.
On Thu, May 19, 2022 at 6:44 AM Kent Yao wrote:
Hi,
I verified the simple case below with the binary release, and it looks
like a bug to me.
bin/spark-sql -e "select date '2018-11-17' > 1"
Error in query: Invalid call to toAttribute on unresolved object;
'Project [unresolvedalias((2018-11-17 > 1), None)]
+- OneRowRelation
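(As a side note on the expected behavior: a comparison between a date and a bare integer should surface as a clear type error at analysis time, not as an internal "unresolved object" error. A rough analogy in plain Python, not Spark, just to illustrate the kind of rejection one would expect:)

```python
from datetime import date

# Plain-Python analogy: comparing a date to an int is rejected with a
# clear TypeError, rather than an opaque internal error.
try:
    date(2018, 11, 17) > 1
except TypeError as e:
    print("comparison rejected:", e)
```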
Both 3.2 releases
-1. There is a regression: https://github.com/apache/spark/pull/36595
On Wed, May 18, 2022 at 4:11 PM Martin Grigorov wrote:
Hi,
[X] +1 Release this package as Apache Spark 3.3.0
Tested:
- make local distribution from sources (with ./dev/make-distribution.sh
--tgz --name with-volcano -Pkubernetes,volcano,hadoop-3)
- create a Docker image (with JDK 11)
- run Pi example on
-- local
-- Kubernetes with default scheduler
There might be other blockers. Let's wait and see.
On Tue, May 17, 2022 at 8:59 PM beliefer wrote:
OK, let it go into 3.3.1.
On 2022-05-17 18:59:13, "Hyukjin Kwon" wrote:
I think most users won't be affected since aggregate pushdown is disabled by
default.
And it seems it won't be a breaking change, because adding a new method
won't break binary compatibility.
On Tue, 17 May 2022 at 19:53, beliefer wrote:
If we don't include https://github.com/apache/spark/pull/36556, we will
introduce a breaking change when we merge it into 3.3.1.
On 2022-05-17 18:26:12, "Hyukjin Kwon" wrote:
We need to add https://github.com/apache/spark/pull/36556 to RC2.
We will likely have to change the version being added if RC2 passes.
Since this is a new API/improvement, I would prefer to not block the
release by that.
On Tue, 17 May 2022 at 19:19, beliefer wrote:
We need to add https://github.com/apache/spark/pull/36556 to RC2.
On 2022-05-17 17:37:13, "Hyukjin Kwon" wrote:
That seems like a test-only issue. I made a quick followup at
https://github.com/apache/spark/pull/36576.
On Tue, 17 May 2022 at 03:56, Sean Owen wrote:
I'm still seeing failures related to the function registry, like:
ExpressionsSchemaSuite:
- Check schemas for expression examples *** FAILED ***
396 did not equal 398 Expected 396 blocks in result file but got 398. Try
regenerating the result files. (ExpressionsSchemaSuite.scala:161)
Please vote on releasing the following candidate as
Apache Spark version 3.3.0.
The vote is open until 11:59pm Pacific time May 19th and passes if a
majority of +1 PMC votes are cast, with a minimum of 3 +1 votes.
[ ] +1 Release this package as Apache Spark 3.3.0
[ ] -1 Do not release this package