xinrong-databricks commented on code in PR #36793:
URL: https://github.com/apache/spark/pull/36793#discussion_r892863327
##
python/pyspark/sql/tests/test_session.py:
##
@@ -379,6 +381,54 @@ def test_use_custom_class_for_extensions(self):
)
+class
Borjianamin98 commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1150401390
> @Borjianamin98 Do you have a jira account? I tried to assign the jira to
you but can't find you.
My username in JIRA is `borjianamin`, the same as the one I used when I created the issue
HyukjinKwon commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1150399434
I read e.g.,
https://lists.apache.org/thread/tcjh5wlthg21j519tl7o25cdo81792vr vs.
https://github.com/apache/spark/pull/25607#issuecomment-525745116
Using other branches is
huaxingao commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1150390322
@Borjianamin98 Do you have a jira account? I tried to assign the jira to you
but can't find you.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
huaxingao commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1150387353
Merged to master/3.3/3.2/3.1. Thanks @Borjianamin98 for your first contribution, and welcome to the Spark community!
Also thanks @LuciferYang @dcoliversun for reviewing!
huaxingao closed pull request #36781: [SPARK-39393][SQL] Parquet data source
only supports push-down predicate filters for non-repeated primitive types
URL: https://github.com/apache/spark/pull/36781
huaxingao commented on PR #36810:
URL: https://github.com/apache/spark/pull/36810#issuecomment-1150378282
LGTM. Pending test results
mridulm commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r892802648
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -1068,25 +1086,61 @@ private[spark] class TaskSetManager(
* Check if the task
MaxGekk opened a new pull request, #36811:
URL: https://github.com/apache/spark/pull/36811
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
mridulm commented on PR #36709:
URL: https://github.com/apache/spark/pull/36709#issuecomment-1150310283
+CC @Ngone51 as well, since you had reviewed the original change.
dongjoon-hyun commented on PR #36807:
URL: https://github.com/apache/spark/pull/36807#issuecomment-1150278895
Thank you always for your proactive contribution, @LuciferYang !
MaxGekk commented on PR #36753:
URL: https://github.com/apache/spark/pull/36753#issuecomment-1150249981
> 3.1 already does not have any transform with subqueries function ...
Could you list the required PRs, please? Is it possible to extract only the needed functions from them?
MaxGekk closed pull request #36804: [SPARK-39412][SQL] Exclude
IllegalStateException from Spark's internal errors
URL: https://github.com/apache/spark/pull/36804
MaxGekk commented on PR #36804:
URL: https://github.com/apache/spark/pull/36804#issuecomment-1150243540
Merging to master/3.3.
Thank you, @HeartSaVioR @cloud-fan for review.
olaky commented on PR #36753:
URL: https://github.com/apache/spark/pull/36753#issuecomment-1150242302
@MaxGekk branch 3.1 already does not have any transform-with-subqueries function, so I would have to backport that as well. I personally feel that this could be a risky endeavour not worth doing
srielau commented on code in PR #36693:
URL: https://github.com/apache/spark/pull/36693#discussion_r892641206
##
core/src/main/scala/org/apache/spark/ErrorInfo.scala:
##
@@ -73,18 +73,20 @@ private[spark] object SparkThrowableHelper {
def getMessage(
errorClass:
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892631764
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
singhpk234 commented on PR #36810:
URL: https://github.com/apache/spark/pull/36810#issuecomment-1150161063
cc @srowen @HyukjinKwon
singhpk234 opened a new pull request, #36810:
URL: https://github.com/apache/spark/pull/36810
### What changes were proposed in this pull request?
We should not try casting everything returned by `removeLeadingZerosFromNumberTypePartition` to string, as it can return a null value
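A minimal sketch of the null-handling hazard described above, using hypothetical Python stand-ins (`remove_leading_zeros` and `normalize_partition` are illustrative names, not Spark's Scala internals): casting unconditionally would turn a null partition value into the string `"None"`, while a guard preserves the null.

```python
def remove_leading_zeros(partition_value):
    """Hypothetical stand-in for removeLeadingZerosFromNumberTypePartition:
    returns the value without leading zeros, or None for a null partition."""
    if partition_value is None:
        return None
    return partition_value.lstrip("0") or "0"


def normalize_partition(partition_value):
    result = remove_leading_zeros(partition_value)
    # Only cast to string when the helper actually produced a value;
    # str(None) would silently yield the literal "None".
    return str(result) if result is not None else None
```

The same guard, expressed in the PR's actual Scala, would pattern-match on the option-like result instead of converting it blindly.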
cloud-fan opened a new pull request, #36809:
URL: https://github.com/apache/spark/pull/36809
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892098440
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892098440
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892593552
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -736,6 +737,24 @@ abstract class TypeCoercionBase {
}
}
+
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892575789
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r892566204
##
core/src/main/scala/org/apache/spark/SparkContext.scala:
##
@@ -461,7 +467,8 @@ class SparkContext(config: SparkConf) extends Logging {
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r892558108
##
core/src/main/scala/org/apache/spark/scheduler/OutputCommitCoordinator.scala:
##
@@ -155,9 +159,9 @@ private[spark] class OutputCommitCoordinator(conf:
SparkConf,
cloud-fan commented on code in PR #36693:
URL: https://github.com/apache/spark/pull/36693#discussion_r892541582
##
core/src/main/scala/org/apache/spark/SparkException.scala:
##
@@ -73,236 +101,340 @@ private[spark] case class ExecutorDeadException(message:
String)
*/
cloud-fan commented on code in PR #36693:
URL: https://github.com/apache/spark/pull/36693#discussion_r892535272
##
core/src/main/scala/org/apache/spark/ErrorInfo.scala:
##
@@ -73,18 +73,20 @@ private[spark] object SparkThrowableHelper {
def getMessage(
errorClass:
AmplabJenkins commented on PR #36806:
URL: https://github.com/apache/spark/pull/36806#issuecomment-1150061039
Can one of the admins verify this patch?
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r892521784
##
core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:
##
@@ -2588,6 +2588,14 @@ private[spark] class DAGScheduler(
runningStages -= stage
}
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r892518640
##
core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala:
##
@@ -2588,6 +2588,14 @@ private[spark] class DAGScheduler(
runningStages -= stage
}
cloud-fan commented on code in PR #36564:
URL: https://github.com/apache/spark/pull/36564#discussion_r892511172
##
core/src/main/scala/org/apache/spark/mapred/SparkHadoopMapRedUtil.scala:
##
@@ -76,6 +76,8 @@ object SparkHadoopMapRedUtil extends Logging {
if
AngersZh commented on code in PR #36786:
URL: https://github.com/apache/spark/pull/36786#discussion_r892496313
##
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:
##
@@ -140,7 +143,10 @@ private[hive] object
wangyum commented on code in PR #36786:
URL: https://github.com/apache/spark/pull/36786#discussion_r892493780
##
sql/hive-thriftserver/src/main/scala/org/apache/spark/sql/hive/thriftserver/SparkSQLCLIDriver.scala:
##
@@ -140,7 +143,10 @@ private[hive] object SparkSQLCLIDriver
huaxingao closed pull request #36805: [SPARK-39413][SQL] Capitalize sql
keywords in JDBCV2Suite
URL: https://github.com/apache/spark/pull/36805
huaxingao commented on PR #36805:
URL: https://github.com/apache/spark/pull/36805#issuecomment-1150004547
Merged to master. Thanks @beliefer
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892459971
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -736,6 +737,24 @@ abstract class TypeCoercionBase {
}
}
+
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892455937
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
cloud-fan commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1149985883
> We always use master branch to release, no?
No, we use branch-3.3 to release 3.3.x
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892098440
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892428863
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
MaxGekk commented on PR #36753:
URL: https://github.com/apache/spark/pull/36753#issuecomment-1149949544
@olaky This PR has been merged to branch-3.2 already, see
https://github.com/apache/spark/commit/d611d1f66761bd39fee850ca3f435027f9fc1e3c
Please, open separate PRs against
cxzl25 opened a new pull request, #36808:
URL: https://github.com/apache/spark/pull/36808
### What changes were proposed in this pull request?
Start `HadoopDelegationTokenManager` when `LocalSchedulerBackend` starts.
The behavior is similar to `CoarseGrainedSchedulerBackend` startup,
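The startup symmetry described above can be sketched with hypothetical stand-ins (neither class below is Spark's actual `HadoopDelegationTokenManager` or `LocalSchedulerBackend`); the sketch only shows the pattern of bringing up the token manager from the backend's `start()`, mirroring what the coarse-grained backend does:

```python
class TokenManager:
    """Hypothetical stand-in for HadoopDelegationTokenManager."""

    def __init__(self) -> None:
        self.started = False

    def start(self) -> None:
        # The real manager would obtain Hadoop delegation tokens here
        # and schedule their periodic renewal.
        self.started = True


class LocalBackend:
    """Hypothetical stand-in for LocalSchedulerBackend."""

    def __init__(self, token_manager: TokenManager) -> None:
        self.token_manager = token_manager

    def start(self) -> None:
        # Mirror CoarseGrainedSchedulerBackend: start token renewal
        # together with the backend itself, so local mode gets the same
        # delegation-token behavior as cluster mode.
        self.token_manager.start()
```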
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892376424
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -736,6 +737,24 @@ abstract class TypeCoercionBase {
}
}
+
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892374424
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -736,6 +737,24 @@ abstract class TypeCoercionBase {
}
}
+
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892338094
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -736,6 +737,24 @@ abstract class TypeCoercionBase {
}
}
+
Ngone51 commented on code in PR #36512:
URL: https://github.com/apache/spark/pull/36512#discussion_r892317333
##
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##
@@ -933,46 +935,56 @@ private[spark] class BlockManager(
})
Some(new
Ngone51 commented on code in PR #36512:
URL: https://github.com/apache/spark/pull/36512#discussion_r892314812
##
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##
@@ -933,46 +935,56 @@ private[spark] class BlockManager(
})
Some(new
Ngone51 commented on code in PR #36512:
URL: https://github.com/apache/spark/pull/36512#discussion_r892313257
##
core/src/main/scala/org/apache/spark/storage/BlockManager.scala:
##
@@ -933,46 +935,56 @@ private[spark] class BlockManager(
})
Some(new
LuciferYang commented on PR #36807:
URL: https://github.com/apache/spark/pull/36807#issuecomment-1149862059
Waiting for the silencer upgrade.
olaky commented on PR #36753:
URL: https://github.com/apache/spark/pull/36753#issuecomment-1149853529
@MaxGekk since you closed this, should I still work on propagating this to
3.1 and 3.0? And how should we deal with the test failures happening on
branch-3.2?
LuciferYang commented on PR #36807:
URL: https://github.com/apache/spark/pull/36807#issuecomment-1149850412
For testing.
LuciferYang opened a new pull request, #36807:
URL: https://github.com/apache/spark/pull/36807
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How was
MaxGekk commented on code in PR #35715:
URL: https://github.com/apache/spark/pull/35715#discussion_r892247619
##
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:
##
@@ -683,6 +683,41 @@ class AdaptiveQueryExecSuite
}
}
wayneguow commented on PR #36775:
URL: https://github.com/apache/spark/pull/36775#issuecomment-1149679199
+1 to @JoshRosen's issue.
We also encountered this problem in our ETL scenario. When reading abnormally from the filesystem (such as a read timeout exception, which may have succeeded
HyukjinKwon commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1149651712
Let me backport it just to be safe, in any event.
HyukjinKwon commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1149650782
We always use master branch to release, no?
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892101500
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/TypeCoercion.scala:
##
@@ -806,6 +825,7 @@ abstract class TypeCoercionBase {
object TypeCoercion
EnricoMi commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r892098440
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/plans/logical/basicLogicalOperators.scala:
##
@@ -1227,6 +1227,49 @@ case class Pivot(
override
wwli05 opened a new pull request, #36806:
URL: https://github.com/apache/spark/pull/36806
### What changes were proposed in this pull request?
1. In SPARK-30502, `PeriodicRDDCheckpointer` already supports setting the checkpoint storage level; now in Pregel, the messageCheckpointer also
HeartSaVioR closed pull request #36801: [SPARK-39404][SS] Minor fix for
querying `_metadata` in streaming
URL: https://github.com/apache/spark/pull/36801
HeartSaVioR commented on PR #36801:
URL: https://github.com/apache/spark/pull/36801#issuecomment-1149630507
Thanks! Merging to master.
cloud-fan commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1149629636
shall we merge to 3.3 as well?
HyukjinKwon closed pull request #36803: [SPARK-39411][BUILD] Fix release script
to address type hint in pyspark/version.py
URL: https://github.com/apache/spark/pull/36803
HyukjinKwon commented on PR #36803:
URL: https://github.com/apache/spark/pull/36803#issuecomment-1149604323
Merged to master.
The tests don't run this code path.
beliefer commented on PR #36805:
URL: https://github.com/apache/spark/pull/36805#issuecomment-1149603804
ping @huaxingao cc @cloud-fan
HyukjinKwon closed pull request #36802: [SPARK-39321][SQL][TESTS][FOLLOW-UP]
Respect CastWithAnsiOffSuite.ansiEnabled in 'cast string to date #2'
URL: https://github.com/apache/spark/pull/36802
HyukjinKwon commented on PR #36802:
URL: https://github.com/apache/spark/pull/36802#issuecomment-1149603537
Merged to master.
beliefer opened a new pull request, #36805:
URL: https://github.com/apache/spark/pull/36805
### What changes were proposed in this pull request?
`JDBCV2Suite` contains some test cases whose SQL keywords are not capitalized.
This PR capitalizes the SQL keywords in `JDBCV2Suite`.
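As a sketch of the kind of cleanup this PR describes (the keyword list and helper below are illustrative, not code from the PR), uppercasing keywords in a test query might look like:

```python
import re

# Illustrative subset of SQL keywords; the suite's queries use more.
KEYWORDS = {"select", "from", "where", "group", "by", "order", "limit"}

def capitalize_keywords(sql: str) -> str:
    """Uppercase known SQL keywords, leaving identifiers untouched.
    Note: a sketch this simple would also touch keywords inside string
    literals, which a real cleanup would do by hand instead."""
    def repl(match: re.Match) -> str:
        word = match.group(0)
        return word.upper() if word.lower() in KEYWORDS else word
    return re.sub(r"[A-Za-z_]+", repl, sql)
```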
MaxGekk opened a new pull request, #36804:
URL: https://github.com/apache/spark/pull/36804
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
HeartSaVioR commented on code in PR #36704:
URL: https://github.com/apache/spark/pull/36704#discussion_r891998689
##
connector/kafka-0-10-sql/src/test/scala/org/apache/spark/sql/kafka010/KafkaMicroBatchSourceSuite.scala:
##
@@ -666,9 +667,10 @@ abstract class
HyukjinKwon closed pull request #36799: [SPARK-39350][SQL] Add flag to control
breaking change process for: DESC NAMESPACE EXTENDED should redact properties
URL: https://github.com/apache/spark/pull/36799
HyukjinKwon commented on PR #36799:
URL: https://github.com/apache/spark/pull/36799#issuecomment-1149535232
Merged to master.
HyukjinKwon opened a new pull request, #36803:
URL: https://github.com/apache/spark/pull/36803
### What changes were proposed in this pull request?
This PR proposes to handle the type hint `__version__: str` correctly in each release. The type hint was added in Spark 3.3.0 at
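A hedged sketch of what the release-script fix might involve (the helper below is illustrative, not the actual script in the PR): the version rewrite has to match both the pre-3.3 `__version__ = "..."` form and the new type-hinted form.

```python
import re

def set_version(source: str, new_version: str) -> str:
    """Rewrite the __version__ line of a version.py-style file,
    tolerating an optional `: str` annotation (Spark 3.3.0+)."""
    pattern = r'^__version__(\s*:\s*str)?\s*=\s*".*"$'
    # \1 re-emits the annotation if present; on Python 3.5+ an
    # unmatched group substitutes as the empty string.
    replacement = rf'__version__\1 = "{new_version}"'
    return re.sub(pattern, replacement, source, flags=re.MULTILINE)
```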
huaxingao commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1149525380
After thinking it over, I think it's better to have a separate PR to fix the explain problem.
MaxGekk closed pull request #36792: [SPARK-39392][SQL][3.3] Refine ANSI error
messages for try_* function hints
URL: https://github.com/apache/spark/pull/36792
MaxGekk commented on PR #36792:
URL: https://github.com/apache/spark/pull/36792#issuecomment-1149521109
+1, LGTM. Merging to 3.3.
Thank you, @vli-databricks and @HyukjinKwon for review.
huaxingao commented on PR #36781:
URL: https://github.com/apache/spark/pull/36781#issuecomment-1149518253
The fix looks good but the explain result bothers me. Here is what I got
from the explain result:
```
spark.read.parquet(dir.getCanonicalPath).filter("isnotnull(f)").explain(true)
```
cxzl25 commented on code in PR #36787:
URL: https://github.com/apache/spark/pull/36787#discussion_r891950939
##
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/orc/OrcQuerySuite.scala:
##
@@ -832,6 +832,18 @@ abstract class OrcQuerySuite extends OrcQueryTest
cxzl25 commented on code in PR #36787:
URL: https://github.com/apache/spark/pull/36787#discussion_r891950939
##
sql/core/src/test/scala/org/apache/spark/sql/execution/datasources/orc/OrcQuerySuite.scala:
##
@@ -832,6 +832,18 @@ abstract class OrcQuerySuite extends OrcQueryTest