panbingkun opened a new pull request, #39218:
URL: https://github.com/apache/spark/pull/39218
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
zhengruifeng closed pull request #39209: [SPARK-41703][CONNECT][PYTHON] Combine
NullType and typed_null in Literal
URL: https://github.com/apache/spark/pull/39209
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
zhengruifeng commented on PR #39209:
URL: https://github.com/apache/spark/pull/39209#issuecomment-1364986470
merged into master, thanks @HyukjinKwon for the reviews
LuciferYang commented on code in PR #39202:
URL: https://github.com/apache/spark/pull/39202#discussion_r1057124135
##
core/src/main/scala/org/apache/spark/internal/config/History.scala:
##
@@ -79,6 +79,21 @@ private[spark] object History {
.stringConf
.createOptional
LuciferYang commented on PR #39202:
URL: https://github.com/apache/spark/pull/39202#issuecomment-1364993175
> Make it possible for SHS to read the live UI rocksdb instance.
Will this be supported in the future, or is it already supported?
LuciferYang commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057134901
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,111 @@ case class ArrayExcept(left: Express
HyukjinKwon commented on PR #39210:
URL: https://github.com/apache/spark/pull/39210#issuecomment-1364999051
Merged to master.
HyukjinKwon closed pull request #39210: [SPARK-41704][BUILD] Upgrade
`sbt-assembly` from 2.0.0 to 2.1.0
URL: https://github.com/apache/spark/pull/39210
uzadude opened a new pull request, #39219:
URL: https://github.com/apache/spark/pull/39219
### What changes were proposed in this pull request?
This PR proposes to auto-infer bucketing information from actions that
contain a shuffle.
### Why are the changes needed?
Seems
zhengruifeng commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057139556
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,133 @@ case class ArrayExcept(left: Expres
ulysses-you opened a new pull request, #39220:
URL: https://github.com/apache/spark/pull/39220
### What changes were proposed in this pull request?
This PR aims to make CTAS use a nested execution instead of running the data
writing command.
So, we can clean up ctas itself t
ulysses-you commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057141344
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,27 +141,7 @@ case class CreateDataSourceTableAsSelectComm
beliefer commented on PR #39213:
URL: https://github.com/apache/spark/pull/39213#issuecomment-1365006353
The GA failure is unrelated to this PR.
ping @HyukjinKwon @grundprinzip @zhengruifeng @amaliujia
beliefer commented on PR #38799:
URL: https://github.com/apache/spark/pull/38799#issuecomment-1365007428
ping @cloud-fan
zhengruifeng commented on PR #38867:
URL: https://github.com/apache/spark/pull/38867#issuecomment-1365013704
@Daniel-Davies Sorry for the late reply.
On `Input item parameter is null:`
The issue of NULL handling was controversial, and we spent some time
discussing it with some SQL experts.
zhengruifeng commented on code in PR #38947:
URL: https://github.com/apache/spark/pull/38947#discussion_r1057148245
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala:
##
@@ -1840,6 +1840,47 @@ class CollectionExpressionsSui
itholic commented on PR #39137:
URL: https://github.com/apache/spark/pull/39137#issuecomment-1365018902
Thanks @grundprinzip for the review.
I agree with your comments and feel they're pretty reasonable.
Actually, I once submitted a PR that implemented the framework on
PySpark-side (h
HyukjinKwon commented on code in PR #39214:
URL: https://github.com/apache/spark/pull/39214#discussion_r1057150277
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -99,6 +100,47 @@ class SparkConnectPlanner(session:
HyukjinKwon commented on code in PR #39214:
URL: https://github.com/apache/spark/pull/39214#discussion_r1057150277
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -99,6 +100,47 @@ class SparkConnectPlanner(session:
zhengruifeng commented on PR #38867:
URL: https://github.com/apache/spark/pull/38867#issuecomment-1365028273
also cc @beliefer
lyy-pineapple commented on PR #38171:
URL: https://github.com/apache/spark/pull/38171#issuecomment-1365029419
> https://user-images.githubusercontent.com/8748814/204439049-53f0bd4f-9ea0-4289-8268-d16aef5b4334.png
>
> @lyy-pineapple Would you share the test sql pattern? I test some c
zhengruifeng commented on PR #38947:
URL: https://github.com/apache/spark/pull/38947#issuecomment-1365029531
also cc @beliefer
LuciferYang commented on PR #38947:
URL: https://github.com/apache/spark/pull/38947#issuecomment-1365031813
@navinvishy could you resolve the conflict?
LuciferYang commented on PR #38867:
URL: https://github.com/apache/spark/pull/38867#issuecomment-1365032081
@Daniel-Davies could you resolve the conflict?
zhengruifeng commented on code in PR #38874:
URL: https://github.com/apache/spark/pull/38874#discussion_r1057161110
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,51 @@ case class ArrayExcept(left: Express
lyy-pineapple commented on code in PR #38171:
URL: https://github.com/apache/spark/pull/38171#discussion_r1057161457
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/regexpExpressionsJoni.scala:
##
@@ -0,0 +1,471 @@
+/*
+ * Licensed to the Apache Software
zhengruifeng closed pull request #39216: [SPARK-41710][CONNECT][PYTHON]
Implement `Column.between`
URL: https://github.com/apache/spark/pull/39216
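(For context on the merged PR above: `Column.between` in PySpark is commonly described as sugar over the existing comparison operators, i.e. `(col >= lower) & (col <= upper)` with both bounds inclusive. The sketch below models only that semantics in plain Python; the function name and shape are illustrative, not the Connect implementation.)

```python
# Minimal sketch of Column.between's inclusive-bounds semantics,
# modeled on plain Python values rather than Spark Columns.

def between(value, lower, upper):
    """Inclusive range check mirroring Column.between semantics."""
    return (value >= lower) and (value <= upper)

print(between(5, 1, 10))   # True
print(between(1, 1, 10))   # True: both bounds are inclusive
print(between(11, 1, 10))  # False
```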
zhengruifeng commented on PR #39216:
URL: https://github.com/apache/spark/pull/39216#issuecomment-1365044421
merged into master, thanks @HyukjinKwon for the reviews
LuciferYang commented on code in PR #39215:
URL: https://github.com/apache/spark/pull/39215#discussion_r1057167898
##
core/src/main/scala/org/apache/spark/status/api/v1/api.scala:
##
@@ -461,7 +461,7 @@ class ApplicationEnvironmentInfo private[spark] (
val systemProperties:
LuciferYang commented on code in PR #39215:
URL: https://github.com/apache/spark/pull/39215#discussion_r1057167898
##
core/src/main/scala/org/apache/spark/status/api/v1/api.scala:
##
@@ -461,7 +461,7 @@ class ApplicationEnvironmentInfo private[spark] (
val systemProperties:
codecov-commenter commented on PR #39215:
URL: https://github.com/apache/spark/pull/39215#issuecomment-1365070569
#
[Codecov](https://codecov.io/gh/apache/spark/pull/39215?src=pr&el=h1&utm_medium=referral&utm_source=github&utm_content=comment&utm_campaign=pr+comments&utm_term=The+Apache+Soft
grundprinzip commented on PR #39137:
URL: https://github.com/apache/spark/pull/39137#issuecomment-1365099279
> * I worried that maybe it would not be easy to maintain when the rules
on one side (PySpark vs JVM) were arbitrarily changed in the future. So, I
wanted to manage all errors in
infoankitp commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057196837
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala:
##
@@ -2596,4 +2596,113 @@ class CollectionExpressionsSuit
infoankitp commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057197385
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,133 @@ case class ArrayExcept(left: Expressi
HyukjinKwon commented on code in PR #39214:
URL: https://github.com/apache/spark/pull/39214#discussion_r1057211943
##
python/pyspark/sql/connect/session.py:
##
@@ -120,6 +121,11 @@ def __init__(self, connectionString: str, userId:
Optional[str] = None):
# Parse the con
HyukjinKwon commented on code in PR #39214:
URL: https://github.com/apache/spark/pull/39214#discussion_r1057212456
##
python/pyspark/sql/tests/test_catalog.py:
##
@@ -16,20 +16,21 @@
#
Review Comment:
The changes made in reused PySpark tests are two:
1. Do not check
HyukjinKwon commented on PR #39214:
URL: https://github.com/apache/spark/pull/39214#issuecomment-1365124966
Should be ready for a look. cc @amaliujia @zhengruifeng @hvanhovell
@grundprinzip
HyukjinKwon commented on code in PR #39214:
URL: https://github.com/apache/spark/pull/39214#discussion_r1057211943
##
python/pyspark/sql/connect/session.py:
##
@@ -120,6 +121,11 @@ def __init__(self, connectionString: str, userId:
Optional[str] = None):
# Parse the con
cloud-fan commented on code in PR #39133:
URL: https://github.com/apache/spark/pull/39133#discussion_r1057233265
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/trees/TreePatterns.scala:
##
@@ -134,6 +134,7 @@ object TreePattern extends Enumeration {
val UNRESOL
infoankitp commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057236664
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,111 @@ case class ArrayExcept(left: Expressi
infoankitp commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057236944
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/collectionOperations.scala:
##
@@ -4600,3 +4600,111 @@ case class ArrayExcept(left: Expressi
bjornjorgensen commented on PR #39196:
URL: https://github.com/apache/spark/pull/39196#issuecomment-1365190148
My intention is to explain to a new contributor how I do it and what tools
I use. Sonar is built on best-practice rules; the problem is that not
everything applies equally well. Her
shrprasa opened a new pull request, #39221:
URL: https://github.com/apache/spark/pull/39221
### What changes were proposed in this pull request?
In SSLOptions, the rest of the settings should be set only when SSL is enabled.
### Why are the changes needed?
If ${ns}.enabled is fals
cloud-fan opened a new pull request, #39222:
URL: https://github.com/apache/spark/pull/39222
### What changes were proposed in this pull request?
It's a bit confusing to have both `UnresolvedFunc` and `UnresolvedFunction`.
This PR renames `UnresolvedFunc` to `UnresolvedFunctio
cloud-fan commented on PR #39222:
URL: https://github.com/apache/spark/pull/39222#issuecomment-1365195933
cc @rxin @viirya
HyukjinKwon commented on PR #39214:
URL: https://github.com/apache/spark/pull/39214#issuecomment-1365200657
This PR supports everything the same as current PySpark does, for now.
HyukjinKwon opened a new pull request, #39223:
URL: https://github.com/apache/spark/pull/39223
### What changes were proposed in this pull request?
TBD
### Why are the changes needed?
TBD
### Does this PR introduce _any_ user-facing change?
TBD
### How wa
bjornjorgensen commented on code in PR #39180:
URL: https://github.com/apache/spark/pull/39180#discussion_r1057246975
##
python/pyspark/sql/connect/window.py:
##
@@ -306,263 +217,27 @@ class Window:
@staticmethod
def partitionBy(*cols: Union["ColumnOrName", List["Col
beliefer commented on code in PR #38865:
URL: https://github.com/apache/spark/pull/38865#discussion_r1057262332
##
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/expressions/CollectionExpressionsSuite.scala:
##
@@ -2596,4 +2596,113 @@ class CollectionExpressionsSuite
srowen commented on code in PR #39215:
URL: https://github.com/apache/spark/pull/39215#discussion_r1057328506
##
sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala:
##
@@ -486,7 +486,7 @@ private class LiveExecutionData(val executionId: Long)
e
Daniel-Davies commented on PR #38867:
URL: https://github.com/apache/spark/pull/38867#issuecomment-1365494699
@zhengruifeng Great direction provided by your message above, thank you.
Thank you very much also for consulting with experts to resolve the conflict of
ideas. I'll implement the be
HyukjinKwon commented on PR #39214:
URL: https://github.com/apache/spark/pull/39214#issuecomment-1365495742
Merged to master.
HyukjinKwon closed pull request #39214: [SPARK-41707][CONNECT] Implement
Catalog API in Spark Connect
URL: https://github.com/apache/spark/pull/39214
HyukjinKwon commented on PR #39217:
URL: https://github.com/apache/spark/pull/39217#issuecomment-1365501507
Merged to master.
HyukjinKwon closed pull request #39217: [SPARK-41711][BUILD] Upgrade
protobuf-java to 3.21.12
URL: https://github.com/apache/spark/pull/39217
HyukjinKwon commented on PR #39222:
URL: https://github.com/apache/spark/pull/39222#issuecomment-1365502129
Merged to master.
HyukjinKwon closed pull request #39222: [SPARK-41720][SQL] Rename
UnresolvedFunc to UnresolvedFunctionName
URL: https://github.com/apache/spark/pull/39222
HyukjinKwon commented on code in PR #39180:
URL: https://github.com/apache/spark/pull/39180#discussion_r1057361102
##
python/pyspark/sql/connect/window.py:
##
@@ -306,263 +217,27 @@ class Window:
@staticmethod
def partitionBy(*cols: Union["ColumnOrName", List["Column
github-actions[bot] closed pull request #37721: [SPARK-40272][CORE]Support
service port custom with range
URL: https://github.com/apache/spark/pull/37721
github-actions[bot] commented on PR #37910:
URL: https://github.com/apache/spark/pull/37910#issuecomment-1365517957
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #37625:
URL: https://github.com/apache/spark/pull/37625#issuecomment-1365517972
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
HyukjinKwon commented on PR #39223:
URL: https://github.com/apache/spark/pull/39223#issuecomment-1365522709
cc @amaliujia @zhengruifeng @grundprinzip FYI
HyukjinKwon opened a new pull request, #39224:
URL: https://github.com/apache/spark/pull/39224
### What changes were proposed in this pull request?
This PR proposes to enable doctests in `pyspark.sql.connect.catalog` that is
virtually the same as `pyspark.sql.catalog`.
### Why
HyukjinKwon commented on PR #39223:
URL: https://github.com/apache/spark/pull/39223#issuecomment-1365536717
Related tests passed.
Merged to master.
HyukjinKwon closed pull request #39223: [SPARK-41717][CONNECT] Deduplicate
print and _repr_html_ at LogicalPlan
URL: https://github.com/apache/spark/pull/39223
techaddict opened a new pull request, #39225:
URL: https://github.com/apache/spark/pull/39225
### What changes were proposed in this pull request?
This PR proposes to enable doctests in pyspark.sql.connect.window that is
virtually the same as pyspark.sql.window.
### Why are the cha
techaddict commented on code in PR #39225:
URL: https://github.com/apache/spark/pull/39225#discussion_r1057398834
##
python/pyspark/sql/connect/window.py:
##
@@ -201,7 +201,7 @@ def __repr__(self) -> str:
return "WindowSpec(" + ", ".join(strs) + ")"
-WindowSpec.__do
techaddict commented on code in PR #39225:
URL: https://github.com/apache/spark/pull/39225#discussion_r1057399255
##
python/pyspark/sql/connect/window.py:
##
@@ -218,27 +218,201 @@ class Window:
@staticmethod
def partitionBy(*cols: Union["ColumnOrName", List["ColumnO
techaddict commented on PR #39225:
URL: https://github.com/apache/spark/pull/39225#issuecomment-1365550356
@HyukjinKwon can you take a look?
LuciferYang commented on PR #39217:
URL: https://github.com/apache/spark/pull/39217#issuecomment-1365557057
Thanks @srowen @HyukjinKwon
LuciferYang commented on code in PR #39215:
URL: https://github.com/apache/spark/pull/39215#discussion_r1057406919
##
sql/core/src/main/scala/org/apache/spark/sql/execution/ui/SQLAppStatusListener.scala:
##
@@ -486,7 +486,7 @@ private class LiveExecutionData(val executionId: Lon
ulysses-you commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057141344
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,27 +141,7 @@ case class CreateDataSourceTableAsSelectComm
ulysses-you commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057410190
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,29 +141,9 @@ case class CreateDataSourceTableAsSelectComm
ulysses-you commented on PR #39220:
URL: https://github.com/apache/spark/pull/39220#issuecomment-1365567962
cc @cloud-fan
LuciferYang opened a new pull request, #39226:
URL: https://github.com/apache/spark/pull/39226
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
zhengruifeng opened a new pull request, #39227:
URL: https://github.com/apache/spark/pull/39227
### What changes were proposed in this pull request?
Implement 3 missing time window functions
### Why are the changes needed?
For API coverage
after this PR, following one
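(For context on the time window functions discussed above: Spark's tumbling `window` function assigns each timestamp to the bucket `[start, start + duration)` where `start = ts - (ts % duration)`. The sketch below models only that bucketing rule in plain Python; names are illustrative, and the real functions produce Catalyst expressions, not tuples.)

```python
# Minimal sketch of tumbling-window bucketing semantics, assuming
# integer timestamps in seconds and no startTime offset.

def tumbling_window(ts_seconds, duration_seconds):
    """Return the (start, end) window that ts_seconds falls into."""
    start = ts_seconds - (ts_seconds % duration_seconds)
    return (start, start + duration_seconds)

print(tumbling_window(125, 60))  # (120, 180): second 125 lands in the 120-180 window
```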
zhengruifeng commented on code in PR #39227:
URL: https://github.com/apache/spark/pull/39227#discussion_r1057415979
##
connector/connect/server/src/main/scala/org/apache/spark/sql/connect/planner/SparkConnectPlanner.scala:
##
@@ -691,6 +691,46 @@ class SparkConnectPlanner(sessio
cloud-fan commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057416904
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,29 +141,9 @@ case class CreateDataSourceTableAsSelectComman
cloud-fan commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057417702
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/CreateHiveTableAsSelectCommand.scala:
##
@@ -21,43 +21,23 @@ import scala.util.control.NonFatal
impo
cloud-fan commented on code in PR #39182:
URL: https://github.com/apache/spark/pull/39182#discussion_r1057418565
##
connector/connect/common/src/main/protobuf/spark/connect/relations.proto:
##
@@ -378,10 +379,6 @@ message Sample {
// (Optional) The random seed.
optional
LuciferYang commented on PR #39226:
URL: https://github.com/apache/spark/pull/39226#issuecomment-1365582194
cc @gengliangwang
LuciferYang commented on code in PR #39226:
URL: https://github.com/apache/spark/pull/39226#discussion_r1057419363
##
core/src/main/scala/org/apache/spark/status/AppStatusStore.scala:
##
@@ -36,7 +36,8 @@ import org.apache.spark.util.kvstore.KVStore
*/
private[spark] class Ap
cloud-fan commented on PR #39186:
URL: https://github.com/apache/spark/pull/39186#issuecomment-1365585284
thanks, merging to master!
cloud-fan closed pull request #39186: [SPARK-41690][SQL][CONNECT] Agnostic
Encoders
URL: https://github.com/apache/spark/pull/39186
ulysses-you commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057422003
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,29 +141,9 @@ case class CreateDataSourceTableAsSelectComm
zhengruifeng opened a new pull request, #39228:
URL: https://github.com/apache/spark/pull/39228
### What changes were proposed in this pull request?
Implement `sequence` function
### Why are the changes needed?
for API coverage
### Does this PR introduce _any_ user-fac
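(For context on the `sequence` function above: Spark's `sequence(start, stop[, step])` returns an inclusive array from start to stop, with the step defaulting to 1 or -1 depending on direction. The pure-Python sketch below models that semantics for integers only; the name `spark_sequence` is illustrative.)

```python
# Minimal sketch of sequence(start, stop, step) semantics for integers.

def spark_sequence(start, stop, step=None):
    """Inclusive integer sequence; step defaults to 1 (or -1 when descending)."""
    if step is None:
        step = 1 if stop >= start else -1
    out = []
    cur = start
    while (step > 0 and cur <= stop) or (step < 0 and cur >= stop):
        out.append(cur)
        cur += step
    return out

print(spark_sequence(1, 5))   # [1, 2, 3, 4, 5]
print(spark_sequence(5, 1))   # [5, 4, 3, 2, 1]
```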
zhengruifeng commented on PR #39227:
URL: https://github.com/apache/spark/pull/39227#issuecomment-1365586515
cc @HyukjinKwon
ulysses-you commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057422467
##
sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/CreateHiveTableAsSelectCommand.scala:
##
@@ -21,43 +21,23 @@ import scala.util.control.NonFatal
im
zhengruifeng commented on PR #39228:
URL: https://github.com/apache/spark/pull/39228#issuecomment-1365587203
cc @HyukjinKwon
LuciferYang commented on PR #38874:
URL: https://github.com/apache/spark/pull/38874#issuecomment-1365587749
Should we merge this one first? The other three new-function PRs may need
to resolve conflicts after this merge.
cloud-fan commented on code in PR #39099:
URL: https://github.com/apache/spark/pull/39099#discussion_r1057423980
##
sql/catalyst/src/test/scala/org/apache/spark/sql/types/DecimalSuite.scala:
##
@@ -384,4 +387,51 @@ class DecimalSuite extends SparkFunSuite with
PrivateMethodTest
cloud-fan commented on code in PR #39099:
URL: https://github.com/apache/spark/pull/39099#discussion_r1057424394
##
sql/catalyst/src/main/scala/org/apache/spark/sql/types/Decimal.scala:
##
@@ -374,7 +374,7 @@ final class Decimal extends Ordered[Decimal] with
Serializable {
amaliujia closed pull request #38807: [SPARK-41270][CONNECT] Add Catalog
tableExists and databaseExists in Connect proto
URL: https://github.com/apache/spark/pull/38807
zhengruifeng opened a new pull request, #39229:
URL: https://github.com/apache/spark/pull/39229
### What changes were proposed in this pull request?
Implement `format_number` function
### Why are the changes needed?
for API coverage
### Does this PR introduce _
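(For context on `format_number` above: it renders a number with thousands separators and a fixed number of decimal places, e.g. `format_number(lit(1234567.891), 2)` yields `1,234,567.89`. The sketch below approximates that output shape with Python string formatting; rounding edge cases may differ from Spark's `DecimalFormat`-based implementation.)

```python
# Minimal sketch of format_number's output shape: grouped thousands
# plus d decimal places, using Python's built-in formatting.

def format_number(x, d):
    """Format x with thousands separators and d decimal places."""
    return f"{x:,.{d}f}"

print(format_number(1234567.891, 2))  # 1,234,567.89
```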
cloud-fan commented on PR #39062:
URL: https://github.com/apache/spark/pull/39062#issuecomment-1365589357
cc @beliefer
cloud-fan commented on PR #38874:
URL: https://github.com/apache/spark/pull/38874#issuecomment-1365589473
thanks, merging to master!
zhengruifeng commented on code in PR #39229:
URL: https://github.com/apache/spark/pull/39229#discussion_r1057425058
##
python/pyspark/sql/connect/functions.py:
##
@@ -1629,11 +1629,11 @@ def encode(col: "ColumnOrName", charset: str) -> Column:
encode.__doc__ = pysparkfuncs.enco
cloud-fan closed pull request #38874: [SPARK-41235][SQL][PYTHON]High-order
function: array_compact implementation
URL: https://github.com/apache/spark/pull/38874
cloud-fan commented on code in PR #39220:
URL: https://github.com/apache/spark/pull/39220#discussion_r1057425206
##
sql/core/src/main/scala/org/apache/spark/sql/execution/command/createDataSourceTables.scala:
##
@@ -143,29 +141,9 @@ case class CreateDataSourceTableAsSelectComman
zhengruifeng commented on PR #39229:
URL: https://github.com/apache/spark/pull/39229#issuecomment-1365589844
cc @HyukjinKwon