cxzl25 commented on PR #36808:
URL: https://github.com/apache/spark/pull/36808#issuecomment-1150693473
Duplicate of #32009
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
cxzl25 closed pull request #36808: [SPARK-39415][CORE] Local mode supports
HadoopDelegationTokenManager
URL: https://github.com/apache/spark/pull/36808
HyukjinKwon commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150686914
Python docs passed.
Merged to master, branch-3.3 and branch-3.2.
HyukjinKwon closed pull request #36813: [SPARK-39421][PYTHON][DOCS] Pin the
docutils version <0.18 in documentation build
URL: https://github.com/apache/spark/pull/36813
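For context, the pin described in the PR title would look roughly like this in a pip requirements file (an illustrative fragment, not the exact lines from the diff):
```
# dev/requirements.txt (illustrative)
docutils<0.18.0
```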
AngersZh commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893074610
##
core/src/test/scala/org/apache/spark/scheduler/OutputCommitCoordinatorSuite.scala:
##
@@ -187,12 +181,6 @@ class OutputCommitCoordinatorSuite extends
AngersZh commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893072393
##
core/src/main/scala/org/apache/spark/SparkContext.scala:
##
@@ -461,7 +467,8 @@ class SparkContext(config: SparkConf) extends Logging {
cloud-fan commented on code in PR #36663:
URL: https://github.com/apache/spark/pull/36663#discussion_r893067774
##
sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/GeneralScalarExpression.java:
##
@@ -196,6 +196,90 @@
*Since version: 3.4.0
*
*
cloud-fan commented on code in PR #36663:
URL: https://github.com/apache/spark/pull/36663#discussion_r893067221
##
sql/core/src/main/scala/org/apache/spark/sql/catalyst/util/V2ExpressionBuilder.scala:
##
@@ -259,6 +259,55 @@ class V2ExpressionBuilder(
} else {
cloud-fan commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r893062401
##
core/src/main/resources/error/error-classes.json:
##
@@ -551,5 +551,10 @@
"Writing job aborted"
],
"sqlState" : "4"
+ },
+
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893061431
##
core/src/test/scala/org/apache/spark/scheduler/OutputCommitCoordinatorSuite.scala:
##
@@ -187,12 +181,6 @@ class OutputCommitCoordinatorSuite extends SparkFunSuite
HeartSaVioR commented on PR #36737:
URL: https://github.com/apache/spark/pull/36737#issuecomment-1150658562
General comment from what I see in review comments:
I see you repeat the explanation of the code you changed; I don't think
reviewers asked about the detailed explanation of
HyukjinKwon commented on PR #36816:
URL: https://github.com/apache/spark/pull/36816#issuecomment-1150648303
Thanks for updating the guide @Yikun
HyukjinKwon commented on PR #36814:
URL: https://github.com/apache/spark/pull/36814#issuecomment-1150647932
Hey thanks for letting me know. https://github.com/apache/spark/pull/36813
should fix that.
huaxingao commented on PR #36814:
URL: https://github.com/apache/spark/pull/36814#issuecomment-1150646179
@HyukjinKwon
The python doc generation failed. I saw the same error in other PRs too.
```
/__w/spark/spark/docs/_plugins/copy_api_dirs.rb:130:in `':
Python doc generation
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893049422
##
.github/workflows/build_and_test.yml:
##
@@ -547,6 +547,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-35375.
# Pin the
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893049087
##
dev/requirements.txt:
##
@@ -35,6 +35,7 @@ numpydoc
jinja2<3.0.0
sphinx<3.1.0
sphinx-plotly-directive
+docutils~=0.17.0
Review Comment:
```suggestion
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893047340
##
dev/requirements.txt:
##
@@ -35,6 +35,7 @@ numpydoc
jinja2<3.0.0
sphinx<3.1.0
sphinx-plotly-directive
+docutils<0.17.0
Review Comment:
```suggestion
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893047298
##
.github/workflows/build_and_test.yml:
##
@@ -547,6 +547,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-35375.
# Pin the
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893047077
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m
Yikun commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893046990
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m pip
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893046599
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
Review Comment:
itholic commented on PR #36793:
URL: https://github.com/apache/spark/pull/36793#issuecomment-1150640677
Otherwise looks good if test passes
itholic commented on PR #36793:
URL: https://github.com/apache/spark/pull/36793#issuecomment-1150640100
Seems like it would be fixed when https://github.com/apache/spark/pull/36813 is
merged.
Let's rebase after that.
Yikun commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893045350
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m pip
Yikun commented on PR #36816:
URL: https://github.com/apache/spark/pull/36816#issuecomment-1150637320
@LuciferYang Thanks for the reminder, I guess I need a rebase after the doctest
is fixed.
LuciferYang commented on PR #36816:
URL: https://github.com/apache/spark/pull/36816#issuecomment-1150637062
> @LuciferYang Hmm, this is not for fixing the doc test failure, :). It's for
adding the migration doc for
[SPARK-38819](https://issues.apache.org/jira/browse/SPARK-38819)
OK
Yikun commented on PR #36816:
URL: https://github.com/apache/spark/pull/36816#issuecomment-1150636796
@LuciferYang Hmm, this is not for fixing the doc test failure, :). It's for
adding the migration doc for SPARK-38819
nyingping commented on code in PR #36737:
URL: https://github.com/apache/spark/pull/36737#discussion_r893042194
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/Analyzer.scala:
##
@@ -3963,8 +3966,10 @@ object TimeWindowing extends Rule[LogicalPlan] {
LuciferYang commented on PR #36816:
URL: https://github.com/apache/spark/pull/36816#issuecomment-1150634766
@Yikun Will the current PR fix
[SPARK-39424](https://issues.apache.org/jira/browse/SPARK-39424)?
LuciferYang commented on PR #36807:
URL: https://github.com/apache/spark/pull/36807#issuecomment-1150633518
There are no official release notes yet
Yikun opened a new pull request, #36816:
URL: https://github.com/apache/spark/pull/36816
### What changes were proposed in this pull request?
Add migration guide for pandas 1.4 behavior changes:
* SPARK-39054 https://github.com/apache/spark/pull/36581: In Spark 3.4, if
Pandas
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893038044
##
core/src/test/scala/org/apache/spark/scheduler/OutputCommitCoordinatorSuite.scala:
##
@@ -187,8 +188,8 @@ class OutputCommitCoordinatorSuite extends SparkFunSuite
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893037549
##
core/src/main/scala/org/apache/spark/SparkEnv.scala:
##
@@ -423,6 +432,7 @@ object SparkEnv extends Logging {
envInstance
}
+ // scalastyle:on argcount
gengliangwang commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r893036298
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -3828,6 +3828,15 @@ object SQLConf {
.booleanConf
yaooqinn commented on code in PR #36810:
URL: https://github.com/apache/spark/pull/36810#discussion_r893034665
##
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/parquet/ParquetPartitionDiscoverySuite.scala:
##
@@ -1259,6 +1259,14 @@ class
HyukjinKwon commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150621696
It actually fails with 0.17.0 too
LuciferYang commented on PR #36807:
URL: https://github.com/apache/spark/pull/36807#issuecomment-1150621586
`Run documentation build` failed:
```
/__w/spark/spark/python/pyspark/pandas/supported_api_gen.py:101:
UserWarning: Warning: Latest version of pandas(>=1.4.0) is required to
cloud-fan commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150620951
I think a higher version is better? Maybe we should change the Dockerfile
later (after 3.3 is released...)
HyukjinKwon commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150620405
I matched the version to the Dockerfile.
zhengruifeng opened a new pull request, #35250:
URL: https://github.com/apache/spark/pull/35250
### What changes were proposed in this pull request?
1, override `maxRowsPerPartition` in
`Sort`,`Expand`,`Sample`,`CollectMetrics`;
2, override `maxRows` in
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r893029627
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m
cloud-fan closed pull request #36586: [SPARK-39236][SQL] Make CreateTable and
ListTables be compatible with 3 layer namespace
URL: https://github.com/apache/spark/pull/36586
cloud-fan commented on PR #36586:
URL: https://github.com/apache/spark/pull/36586#issuecomment-1150618882
thanks, merging to master!
cloud-fan commented on code in PR #36586:
URL: https://github.com/apache/spark/pull/36586#discussion_r893029290
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/SessionCatalog.scala:
##
@@ -965,6 +965,10 @@ class SessionCatalog(
singhpk234 commented on PR #36810:
URL: https://github.com/apache/spark/pull/36810#issuecomment-1150618735
@cloud-fan, This seems to be introduced via
[commit](https://github.com/apache/spark/commit/fc29c91f27d866502f5b6cc4261d4943b57e),
dongjoon-hyun commented on PR #36787:
URL: https://github.com/apache/spark/pull/36787#issuecomment-1150615271
Merged to master.
dongjoon-hyun closed pull request #36787: [SPARK-39387][FOLLOWUP][TESTS] Add a
test case for HIVE-25190
URL: https://github.com/apache/spark/pull/36787
beliefer commented on PR #36805:
URL: https://github.com/apache/spark/pull/36805#issuecomment-1150613902
@huaxingao @cloud-fan Thanks a lot!
mridulm commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r893022070
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -576,6 +661,7 @@ public MergeStatuses
mridulm commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r893021304
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -992,6 +1233,45 @@ AppShufflePartitionInfo
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r893019308
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -734,11 +920,35 @@ case class Pmod(
override def nullable:
AngersZh commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r893017815
##
core/src/main/scala/org/apache/spark/SparkContext.scala:
##
@@ -461,7 +467,8 @@ class SparkContext(config: SparkConf) extends Logging {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r893017456
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -734,11 +920,35 @@ case class Pmod(
override def nullable:
dongjoon-hyun commented on PR #36815:
URL: https://github.com/apache/spark/pull/36815#issuecomment-1150604227
Thank you, @sunchao and @cloud-fan .
Eugene-Mark commented on PR #36499:
URL: https://github.com/apache/spark/pull/36499#issuecomment-1150603723
retest this please
dongjoon-hyun commented on PR #36815:
URL: https://github.com/apache/spark/pull/36815#issuecomment-1150603278
Thank you, @HyukjinKwon !
HyukjinKwon commented on PR #36815:
URL: https://github.com/apache/spark/pull/36815#issuecomment-1150603145
dongjoon-hyun commented on PR #36815:
URL: https://github.com/apache/spark/pull/36815#issuecomment-1150602874
cc @maryannxue , @cloud-fan , @HyukjinKwon , @sunchao
dongjoon-hyun commented on PR #34929:
URL: https://github.com/apache/spark/pull/34929#issuecomment-1150601161
Here is a followup.
- https://github.com/apache/spark/pull/36815
dongjoon-hyun opened a new pull request, #36815:
URL: https://github.com/apache/spark/pull/36815
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
HyukjinKwon commented on PR #36683:
URL: https://github.com/apache/spark/pull/36683#issuecomment-1150592704
Let me merge this in a few days ... assuming that we're all good. Hopefully my
benchmark is good enough.
HyukjinKwon commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150589281
seems like Max already fixed it in
https://github.com/apache/spark/blob/master/dev/create-release/spark-rm/Dockerfile#L45
JoshRosen opened a new pull request, #36814:
URL: https://github.com/apache/spark/pull/36814
### What changes were proposed in this pull request?
This PR improves the error message that is thrown when trying to run `SHOW
CREATE TABLE` on a Hive table with an unsupported serde.
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r893003521
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -953,15 +952,16 @@ case class Pmod(
// when we reach here,
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r893003240
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -953,15 +952,16 @@ case class Pmod(
// when we reach here,
cloud-fan closed pull request #36693: [SPARK-39349] Add a centralized
CheckError method for QA of error path
URL: https://github.com/apache/spark/pull/36693
cloud-fan commented on PR #36693:
URL: https://github.com/apache/spark/pull/36693#issuecomment-1150579697
thanks, merging to master!
cloud-fan commented on PR #36693:
URL: https://github.com/apache/spark/pull/36693#issuecomment-1150579622
The python doc issue is being fixed by
https://github.com/apache/spark/pull/36813
gengliangwang commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892996318
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/higherOrderFunctions.scala:
##
@@ -375,9 +375,17 @@ case class ArrayTransform(
//
cloud-fan commented on PR #36813:
URL: https://github.com/apache/spark/pull/36813#issuecomment-1150577614
how about the docker file we use for release?
gengliangwang commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892992649
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -3828,6 +3828,15 @@ object SQLConf {
.booleanConf
cloud-fan commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892991176
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -3828,6 +3828,15 @@ object SQLConf {
.booleanConf
cloud-fan commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892990862
##
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala:
##
@@ -2012,4 +2012,12 @@ private[sql] object QueryExecutionErrors extends
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r892987817
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m
HyukjinKwon commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892987033
##
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala:
##
@@ -2012,4 +2012,12 @@ private[sql] object QueryExecutionErrors extends
HyukjinKwon commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892987033
##
sql/catalyst/src/main/scala/org/apache/spark/sql/errors/QueryExecutionErrors.scala:
##
@@ -2012,4 +2012,12 @@ private[sql] object QueryExecutionErrors extends
HyukjinKwon commented on code in PR #36812:
URL: https://github.com/apache/spark/pull/36812#discussion_r892986587
##
sql/catalyst/src/main/scala/org/apache/spark/sql/internal/SQLConf.scala:
##
@@ -3828,6 +3828,15 @@ object SQLConf {
.booleanConf
HyukjinKwon commented on code in PR #36793:
URL: https://github.com/apache/spark/pull/36793#discussion_r892985507
##
python/pyspark/sql/tests/test_arrow.py:
##
@@ -495,6 +509,22 @@ def test_schema_conversion_roundtrip(self):
schema_rt = from_arrow_schema(arrow_schema)
HyukjinKwon commented on code in PR #36813:
URL: https://github.com/apache/spark/pull/36813#discussion_r892981613
##
.github/workflows/build_and_test.yml:
##
@@ -549,6 +549,7 @@ jobs:
# See also https://issues.apache.org/jira/browse/SPARK-38279.
python3.9 -m
HyukjinKwon opened a new pull request, #36813:
URL: https://github.com/apache/spark/pull/36813
### What changes were proposed in this pull request?
This PR fixes the Sphinx build failure below (see
https://github.com/singhpk234/spark/runs/6799026458?check_suite_focus=true):
wangyum commented on code in PR #36810:
URL: https://github.com/apache/spark/pull/36810#discussion_r892980250
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/PartitioningUtils.scala:
##
@@ -359,7 +359,12 @@ object PartitioningUtils extends SQLConfHelper{
cloud-fan commented on PR #36810:
URL: https://github.com/apache/spark/pull/36810#issuecomment-1150557713
do we know which commit caused this issue? is it a 3.3 only bug?
github-actions[bot] commented on PR #35605:
URL: https://github.com/apache/spark/pull/35605#issuecomment-1150538958
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
wangyum commented on PR #36786:
URL: https://github.com/apache/spark/pull/36786#issuecomment-1150532390
Merged to master.
wangyum closed pull request #36786: [SPARK-39400][SQL] spark-sql should remove
hive resource dir in all case
URL: https://github.com/apache/spark/pull/36786
huaxingao commented on PR #36810:
URL: https://github.com/apache/spark/pull/36810#issuecomment-1150517588
The Python doc generation failure seems to be unrelated. All the other
tests passed.
ueshin opened a new pull request, #36812:
URL: https://github.com/apache/spark/pull/36812
### What changes were proposed in this pull request?
Fixes `ArraySort` to throw an exception when the comparator returns `null`.
Also updates the doc to follow the corrected behavior.
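The hazard being fixed can be sketched outside Spark. The minimal standalone example below (hypothetical, not Spark's `ArraySort` code; the class name `NullComparatorDemo` is made up) shows that a comparator returning a null boxed result fails with a raw `NullPointerException` during auto-unboxing rather than a descriptive error:
```java
import java.util.Arrays;
import java.util.Comparator;

// Minimal sketch, assuming nothing about Spark internals: a comparator that
// returns a null boxed Integer blows up when the result is unboxed to int.
public class NullComparatorDemo {
    static String sortWithNullComparator(Integer[] xs) {
        Comparator<Integer> cmp = (a, b) -> {
            // Returns null when the elements are equal.
            Integer r = a.equals(b) ? null : Integer.valueOf(a.compareTo(b));
            return r;  // auto-unboxing of null throws NullPointerException
        };
        try {
            Arrays.sort(xs, cmp);
            return "sorted";
        } catch (NullPointerException e) {
            return "NullPointerException";
        }
    }

    public static void main(String[] args) {
        // The input needs an equal pair so the comparator actually returns null.
        System.out.println(sortWithNullComparator(new Integer[]{1, 2, 1}));
    }
}
```
This is why surfacing a proper exception for a null comparator result, as the PR description says, is preferable to letting the sort fail opaquely.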
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892940331
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -576,6 +661,7 @@ public MergeStatuses
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892938847
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -992,6 +1233,45 @@ AppShufflePartitionInfo
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892923191
##
resource-managers/yarn/src/test/scala/org/apache/spark/network/shuffle/ShuffleTestAccessor.scala:
##
@@ -44,14 +48,113 @@ object ShuffleTestAccessor {
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892923038
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -655,6 +743,197 @@ public void registerExecutor(String
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892921497
##
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/RemoteBlockPushResolver.java:
##
@@ -203,15 +237,16 @@ private AppShufflePartitionInfo
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892921190
##
common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:
##
@@ -451,6 +472,7 @@ protected File initRecoveryDb(String dbName) {
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892921010
##
common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:
##
@@ -287,7 +301,13 @@ protected void serviceInit(Configuration
zhouyejoe commented on code in PR #35906:
URL: https://github.com/apache/spark/pull/35906#discussion_r892920621
##
common/network-yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java:
##
@@ -230,11 +241,14 @@ protected void serviceInit(Configuration
xinrong-databricks commented on code in PR #36793:
URL: https://github.com/apache/spark/pull/36793#discussion_r892890115
##
python/pyspark/sql/tests/test_arrow.py:
##
@@ -495,6 +509,22 @@ def test_schema_conversion_roundtrip(self):
schema_rt =
huaxingao commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1150422569
@Borjianamin98 I forgot that I need to add you to the contributors list
first. I just did and assigned the JIRA. OK :)
HyukjinKwon commented on code in PR #36793:
URL: https://github.com/apache/spark/pull/36793#discussion_r892865940
##
python/pyspark/sql/tests/test_arrow.py:
##
@@ -495,6 +509,22 @@ def test_schema_conversion_roundtrip(self):
schema_rt = from_arrow_schema(arrow_schema)
HyukjinKwon commented on code in PR #36793:
URL: https://github.com/apache/spark/pull/36793#discussion_r892865940
##
python/pyspark/sql/tests/test_arrow.py:
##
@@ -495,6 +509,22 @@ def test_schema_conversion_roundtrip(self):
schema_rt = from_arrow_schema(arrow_schema)
EnricoMi commented on code in PR #35965:
URL: https://github.com/apache/spark/pull/35965#discussion_r892865243
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/DataSourceV2ScanExecBase.scala:
##
@@ -138,6 +138,15 @@ trait DataSourceV2ScanExecBase extends