sadikovi commented on PR #36726:
URL: https://github.com/apache/spark/pull/36726#issuecomment-1141738160
@beliefer Can you review this PR from JDBC perspective? I think you have
contributed extensively to this part of the code. Also, cc @gengliangwang.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
sadikovi commented on code in PR #36726:
URL: https://github.com/apache/spark/pull/36726#discussion_r885267600
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala:
##
@@ -472,6 +481,15 @@ object JdbcUtils extends Logging with SQLConfHelper
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885265232
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -109,6 +116,13 @@ private[spark] class TaskSetManager(
private val executorDecommissionK
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885264474
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -80,12 +82,17 @@ private[spark] class TaskSetManager(
val copiesRunning = new Array[Int]
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885262650
##
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala:
##
@@ -103,6 +104,9 @@ private[spark] class TaskSchedulerImpl(
// of tasks that are very sh
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885252781
##
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##
@@ -2073,6 +2073,41 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
.
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885249138
##
core/src/main/scala/org/apache/spark/internal/config/package.scala:
##
@@ -2073,6 +2073,41 @@ package object config {
.timeConf(TimeUnit.MILLISECONDS)
.
Ngone51 commented on code in PR #36162:
URL: https://github.com/apache/spark/pull/36162#discussion_r885247661
##
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala:
##
@@ -1217,6 +1289,61 @@ private[spark] class TaskSetManager(
def executorAdded(): Unit = {
LuciferYang commented on code in PR #36732:
URL: https://github.com/apache/spark/pull/36732#discussion_r885245947
##
sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala:
##
@@ -413,7 +413,7 @@ class SQLAppStatusListener(
if (other != exec)
LuciferYang opened a new pull request, #36732:
URL: https://github.com/apache/spark/pull/36732
### What changes were proposed in this pull request?
This PR replaces `filter(!condition)` with `filterNot(condition)`.
### Why are the changes needed?
Use the appropriate API.
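As a minimal illustration of the substitution this PR makes (a hypothetical collection, not the actual `SQLAppStatusListener` code):

```scala
// filter with a negated predicate vs. filterNot: both keep the elements
// for which the predicate is false, but filterNot states the intent directly.
object FilterNotExample extends App {
  val ids = Seq(1, 5, 10, 15)
  val before = ids.filter(!_.equals(10)) // style the PR removes
  val after  = ids.filterNot(_ == 10)    // style the PR introduces
  assert(before == after)
  println(after)                         // List(1, 5, 15)
}
```

Both forms produce the same result; the change is purely about using the more idiomatic API.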
sunchao commented on PR #36721:
URL: https://github.com/apache/spark/pull/36721#issuecomment-1141704120
LGTM too, thanks @LuciferYang !
sunchao commented on code in PR #36697:
URL: https://github.com/apache/spark/pull/36697#discussion_r885234555
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2ScanPartitioning.scala:
##
@@ -32,15 +32,15 @@ import
org.apache.spark.util.collection.Utils.
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885232931
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/decimalExpressions.scala:
##
@@ -232,3 +216,33 @@ case class CheckOverflowInSum(
override p
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885232378
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -575,7 +707,31 @@ case class Divide(
override def symbol: String =
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885231906
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def decimalMetho
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885231224
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def decimalMetho
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885230648
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -521,6 +651,7 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885229342
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -490,10 +599,31 @@ trait DivModLike extends BinaryArithmetic {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885228623
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boole
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885227041
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boole
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885226749
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boole
weixiuli commented on code in PR #36724:
URL: https://github.com/apache/spark/pull/36724#discussion_r885224233
##
sql/core/src/main/scala/org/apache/spark/sql/execution/dynamicpruning/CleanupDynamicPruningFilters.scala:
##
@@ -54,7 +54,8 @@ object CleanupDynamicPruningFilters ex
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885221860
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boole
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885220637
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -422,9 +520,20 @@ case class Multiply(
override def symbol: String
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885220389
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def decimalMetho
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885220020
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -373,11 +457,24 @@ case class Subtract(
override def decimalMetho
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885219727
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -323,11 +390,27 @@ case class Add(
override def decimalMethod: St
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885218585
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -208,6 +210,76 @@ case class Abs(child: Expression, failOnError: Boole
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885218377
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -244,8 +311,7 @@ abstract class BinaryArithmetic extends BinaryOperato
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885213632
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala:
##
@@ -57,10 +53,13 @@ import org.apache.spark.sql.types._
* - LONG get
AngersZh opened a new pull request, #36731:
URL: https://github.com/apache/spark/pull/36731
### What changes were proposed in this pull request?
DescribeTableExec should redact properties
### Why are the changes needed?
DescribeTableExec should redact properties
JoshRosen commented on PR #36680:
URL: https://github.com/apache/spark/pull/36680#issuecomment-1141664634
I'll do a final sign off and merge mid-day tomorrow (Tuesday May 31st),
since today is a US holiday.
AngersZh opened a new pull request, #36730:
URL: https://github.com/apache/spark/pull/36730
### What changes were proposed in this pull request?
ShowTablePropertiesCommand/ShowTablePropertiesExec should redact sensitive
properties.
### Why are the changes needed?
Show table
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885199407
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala:
##
@@ -60,7 +56,7 @@ import org.apache.spark.sql.types._
*/
// scala
LuciferYang commented on PR #36616:
URL: https://github.com/apache/spark/pull/36616#issuecomment-1141663964
cc @wangyum
cloud-fan commented on PR #36727:
URL: https://github.com/apache/spark/pull/36727#issuecomment-1141660590
Probably pyspark is broken in branch 3.2, cc @HyukjinKwon
pan3793 commented on code in PR #36697:
URL: https://github.com/apache/spark/pull/36697#discussion_r885194978
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/v2/V2ScanPartitioning.scala:
##
@@ -32,15 +32,15 @@ import
org.apache.spark.util.collection.Utils.
wangyum commented on code in PR #36625:
URL: https://github.com/apache/spark/pull/36625#discussion_r885194597
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -753,7 +760,10 @@ private[hive] class HiveClientImpl(
assert(s.values.fo
wangyum commented on code in PR #36625:
URL: https://github.com/apache/spark/pull/36625#discussion_r885194296
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -518,7 +520,12 @@ private[hive] class HiveClientImpl(
createTime = h.getTT
huaxingao commented on PR #36644:
URL: https://github.com/apache/spark/pull/36644#issuecomment-1141651608
Thank you all!
huaxingao commented on PR #36727:
URL: https://github.com/apache/spark/pull/36727#issuecomment-1141650961
Looks good. I am puzzled why the tests failed.
LuciferYang commented on code in PR #36726:
URL: https://github.com/apache/spark/pull/36726#discussion_r885186409
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JdbcUtils.scala:
##
@@ -472,6 +481,15 @@ object JdbcUtils extends Logging with SQLConfHelp
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885179955
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala:
##
@@ -60,7 +56,7 @@ import org.apache.spark.sql.types._
*/
// scalast
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885179844
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/DecimalPrecision.scala:
##
@@ -60,7 +56,7 @@ import org.apache.spark.sql.types._
*/
// scalast
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r885178644
##
sql/core/src/main/scala/org/apache/spark/sql/Melt.scala:
##
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contrib
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r885178329
##
sql/core/src/main/scala/org/apache/spark/sql/Melt.scala:
##
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contrib
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r885177916
##
sql/core/src/main/scala/org/apache/spark/sql/Melt.scala:
##
@@ -0,0 +1,143 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
+ * contrib
cloud-fan commented on code in PR #36150:
URL: https://github.com/apache/spark/pull/36150#discussion_r885175634
##
sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala:
##
@@ -2012,6 +2012,152 @@ class Dataset[T] private[sql](
@scala.annotation.varargs
def agg(expr:
beobest2 commented on PR #36729:
URL: https://github.com/apache/spark/pull/36729#issuecomment-1141632078
@HyukjinKwon The current 'supported API generation' function dynamically
compares the modules of `PySpark.pandas` and `pandas` to find the difference.
At this time, the inherited class i
zuston commented on PR #35683:
URL: https://github.com/apache/spark/pull/35683#issuecomment-1141631786
Thanks @abhishekd0907 for submitting this PR. I think this is a good
improvement for the stability of Spark jobs, especially in scenarios where we
implement elastic YARN NMs on K8s.
ulysses-you commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885174282
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -323,11 +389,24 @@ case class Add(
override def decimalMethod:
cloud-fan commented on code in PR #36714:
URL: https://github.com/apache/spark/pull/36714#discussion_r885172773
##
sql/core/src/test/resources/sql-tests/inputs/group-by.sql:
##
@@ -273,3 +273,16 @@ SELECT
FROM aggr
GROUP BY k
ORDER BY k;
+
+-- SPARK-39320: Add the MEDIAN() fu
cloud-fan commented on code in PR #36714:
URL: https://github.com/apache/spark/pull/36714#discussion_r885172628
##
sql/core/src/test/resources/sql-tests/inputs/group-by.sql:
##
@@ -273,3 +273,16 @@ SELECT
FROM aggr
GROUP BY k
ORDER BY k;
+
+-- SPARK-39320: Add the MEDIAN() fu
cloud-fan commented on code in PR #36714:
URL: https://github.com/apache/spark/pull/36714#discussion_r885172461
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/percentiles.scala:
##
@@ -359,6 +359,32 @@ case class Percentile(
)
}
+// scal
beobest2 opened a new pull request, #36729:
URL: https://github.com/apache/spark/pull/36729
### What changes were proposed in this pull request?
The description provided in the supported pandas API list document or the
code comment needs improvement.
Also, there are ca
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r885169100
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/arithmetic.scala:
##
@@ -323,11 +389,24 @@ case class Add(
override def decimalMethod: St
beliefer commented on PR #36714:
URL: https://github.com/apache/spark/pull/36714#issuecomment-1141623396
ping @MaxGekk cc @cloud-fan
zhengruifeng commented on PR #36660:
URL: https://github.com/apache/spark/pull/36660#issuecomment-1141607627
cc @HyukjinKwon I think this PR is ready
cloud-fan commented on PR #36722:
URL: https://github.com/apache/spark/pull/36722#issuecomment-1141592045
thanks, merging to master/3.3!
cloud-fan closed pull request #36722: [SPARK-39335][SQL] DescribeTableCommand
should redact properties
URL: https://github.com/apache/spark/pull/36722
HyukjinKwon closed pull request #36640: [SPARK-39262][PYTHON] Correct the
behavior of creating DataFrame from an RDD
URL: https://github.com/apache/spark/pull/36640
HyukjinKwon commented on PR #36640:
URL: https://github.com/apache/spark/pull/36640#issuecomment-1141576890
Merged to master.
github-actions[bot] closed pull request #35379: [SPARK-38091][SQL] fix bugs in
AvroSerializer
URL: https://github.com/apache/spark/pull/35379
sadikovi commented on code in PR #36726:
URL: https://github.com/apache/spark/pull/36726#discussion_r885115376
##
sql/core/src/main/scala/org/apache/spark/sql/execution/datasources/jdbc/JDBCOptions.scala:
##
@@ -226,6 +226,9 @@ class JDBCOptions(
// The prefix that is added t
AngersZh commented on PR #36564:
URL: https://github.com/apache/spark/pull/36564#issuecomment-1141543181
ping @cloud-fan Could you take a look?
sadikovi commented on PR #36672:
URL: https://github.com/apache/spark/pull/36672#issuecomment-1141516421
Also, can we add a test to check that the DEFAULT values work? Thanks.
sadikovi commented on code in PR #36672:
URL: https://github.com/apache/spark/pull/36672#discussion_r885096588
##
sql/core/src/main/java/org/apache/spark/sql/execution/datasources/parquet/VectorizedParquetRecordReader.java:
##
@@ -270,13 +271,40 @@ private void initBatch(
dongjoon-hyun closed pull request #36728: [SPARK-39341][K8S]
KubernetesExecutorBackend should allow IPv6 pod IP
URL: https://github.com/apache/spark/pull/36728
williamhyun opened a new pull request, #36728:
URL: https://github.com/apache/spark/pull/36728
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
AmplabJenkins commented on PR #36716:
URL: https://github.com/apache/spark/pull/36716#issuecomment-1141366692
Can one of the admins verify this patch?
AmplabJenkins commented on PR #36717:
URL: https://github.com/apache/spark/pull/36717#issuecomment-114136
Can one of the admins verify this patch?
dongjoon-hyun commented on code in PR #36723:
URL: https://github.com/apache/spark/pull/36723#discussion_r884993022
##
sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala:
##
@@ -147,10 +147,10 @@ class DataSourceV2SQLSuite
Array("", "", ""),
cloud-fan commented on PR #36727:
URL: https://github.com/apache/spark/pull/36727#issuecomment-1141335784
cc @huaxingao @beliefer
cloud-fan opened a new pull request, #36727:
URL: https://github.com/apache/spark/pull/36727
### What changes were proposed in this pull request?
In the first version of DS v2 aggregate pushdown, we didn't want to support
nested fields, so we picked `PushableColumnWithoutNested
xinrong-databricks commented on code in PR #36640:
URL: https://github.com/apache/spark/pull/36640#discussion_r884941255
##
python/pyspark/sql/session.py:
##
@@ -611,8 +611,8 @@ def _inferSchema(
:class:`pyspark.sql.types.StructType`
"""
first = rdd.fi
cloud-fan commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884916407
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Optio
cloud-fan commented on code in PR #36708:
URL: https://github.com/apache/spark/pull/36708#discussion_r884912975
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/linearRegression.scala:
##
@@ -291,3 +291,52 @@ case class RegrSlope(left: Expressio
cxzl25 commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884911323
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Option]]
cloud-fan commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884900931
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Optio
cloud-fan commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884899219
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Optio
cxzl25 commented on code in PR #32959:
URL: https://github.com/apache/spark/pull/32959#discussion_r884894581
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/DateTimeUtils.scala:
##
@@ -518,44 +516,55 @@ object DateTimeUtils {
* The return type is [[Option]]
cloud-fan commented on code in PR #36625:
URL: https://github.com/apache/spark/pull/36625#discussion_r884892701
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveExternalCatalog.scala:
##
@@ -842,10 +843,11 @@ private[spark] class HiveExternalCatalog(conf: SparkConf,
had
cloud-fan commented on code in PR #36625:
URL: https://github.com/apache/spark/pull/36625#discussion_r884891864
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -753,7 +760,10 @@ private[hive] class HiveClientImpl(
assert(s.values.
cloud-fan commented on code in PR #36625:
URL: https://github.com/apache/spark/pull/36625#discussion_r884890053
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/client/HiveClientImpl.scala:
##
@@ -518,7 +520,12 @@ private[hive] class HiveClientImpl(
createTime = h.get
cloud-fan commented on code in PR #36593:
URL: https://github.com/apache/spark/pull/36593#discussion_r884887211
##
sql/core/src/main/scala/org/apache/spark/sql/jdbc/JdbcDialects.scala:
##
@@ -304,6 +310,11 @@ abstract class JdbcDialect extends Serializable with
Logging{
*/
cloud-fan commented on code in PR #36593:
URL: https://github.com/apache/spark/pull/36593#discussion_r884880249
##
sql/catalyst/src/main/java/org/apache/spark/sql/connector/util/V2ExpressionSQLBuilder.java:
##
@@ -241,6 +246,11 @@ protected String visitSQLFunction(String funcNam
cloud-fan commented on code in PR #36593:
URL: https://github.com/apache/spark/pull/36593#discussion_r884875139
##
sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/aggregate/UserDefinedAggregateFunc.java:
##
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache So
cloud-fan commented on code in PR #36593:
URL: https://github.com/apache/spark/pull/36593#discussion_r884874417
##
sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/aggregate/UserDefinedAggregateFunc.java:
##
@@ -0,0 +1,67 @@
+/*
+ * Licensed to the Apache So
cloud-fan commented on code in PR #36593:
URL: https://github.com/apache/spark/pull/36593#discussion_r884872763
##
sql/catalyst/src/main/java/org/apache/spark/sql/connector/expressions/GeneralScalarExpression.java:
##
@@ -235,8 +235,8 @@ public String toString() {
try {
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r884783994
##
sql/core/src/test/resources/tpcds-plan-stability/approved-plans-modified/q65.sf100/explain.txt:
##
@@ -151,106 +152,110 @@ Functions [1]: [avg(revenue#21)]
Aggrega
cloud-fan commented on code in PR #36698:
URL: https://github.com/apache/spark/pull/36698#discussion_r884783483
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/aggregate/Average.scala:
##
@@ -75,18 +80,17 @@ abstract class AverageBase
)
protected
AngersZh commented on code in PR #36723:
URL: https://github.com/apache/spark/pull/36723#discussion_r884771145
##
sql/core/src/test/scala/org/apache/spark/sql/connector/DataSourceV2SQLSuite.scala:
##
@@ -147,10 +147,10 @@ class DataSourceV2SQLSuite
Array("", "", ""),
1104056452 commented on PR #36447:
URL: https://github.com/apache/spark/pull/36447#issuecomment-1141086939
Can someone help review this patch? Thanks.
1104056452 commented on PR #36726:
URL: https://github.com/apache/spark/pull/36726#issuecomment-1141086440
Can someone help review this patch? Thanks.
cloud-fan commented on PR #27924:
URL: https://github.com/apache/spark/pull/27924#issuecomment-1141078095
I'm fine with keeping backward compatibility for some "by-accident" features if the
cost is small. Feel free to open a PR to bring back the old behavior if
1. we do not re-introduce the corr
1 - 100 of 155 matches