LuciferYang commented on PR #42480:
URL: https://github.com/apache/spark/pull/42480#issuecomment-1676775356
[5a92ee2](https://github.com/apache/spark/pull/42480/commits/5a92ee2fea3eed657f34e846c1a5d708c097f461)
revert SPARK-44705 for test Java 17
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
cloud-fan commented on PR #42482:
URL: https://github.com/apache/spark/pull/42482#issuecomment-1676763466
cc @aokolnychyi @dongjoon-hyun
cloud-fan opened a new pull request, #42482:
URL: https://github.com/apache/spark/pull/42482
### What changes were proposed in this pull request?
This is a followup of https://github.com/apache/spark/pull/41448 . As an
optimizer rule, the produced plan should be resolved and r
cloud-fan commented on code in PR #42467:
URL: https://github.com/apache/spark/pull/42467#discussion_r1293027327
docs/sql-ref-syntax-aux-set-var.md:
@@ -0,0 +1,98 @@
+---
+layout: global
+title: SET VAR
+displayTitle: SET VAR
+license: |
+ Licensed to the Apache Software
cloud-fan commented on code in PR #42467:
URL: https://github.com/apache/spark/pull/42467#discussion_r1293027030
docs/sql-ref-syntax-aux-set-var.md:
@@ -0,0 +1,98 @@
+---
+layout: global
+title: SET VAR
+displayTitle: SET VAR
+license: |
+ Licensed to the Apache Software
yaooqinn opened a new pull request, #42481:
URL: https://github.com/apache/spark/pull/42481
### What changes were proposed in this pull request?
This PR wraps the catch-block with a new execution id to
QueryExecution.assertAnalyzed. It will reuse `SQLExecution.with
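The wrapping described in that PR summary can be sketched roughly as follows. This is a minimal stand-in, not Spark's actual API: `withNewExecutionId` in Spark is `SQLExecution.withNewExecutionId` with a different signature, and the names `ExecutionIdSketch`, `observedIds`, and `assertAnalyzed(analyze)` here are hypothetical.

```scala
// Hypothetical sketch: wrap the catch block of an "assertAnalyzed" step in a
// fresh execution id, so an analysis failure is still tracked under its own id.
// These are stand-in names, not Spark's real signatures.
object ExecutionIdSketch {
  private var nextId = 0L
  var observedIds: List[Long] = Nil

  // Stand-in for SQLExecution.withNewExecutionId: allocate a new id,
  // record it, then run the body under that id.
  def withNewExecutionId[T](body: => T): T = {
    nextId += 1
    observedIds = observedIds :+ nextId
    body
  }

  // Stand-in for QueryExecution.assertAnalyzed: on analysis failure,
  // run the failure handling under a new execution id before rethrowing.
  def assertAnalyzed(analyze: () => Unit): Unit = {
    try analyze()
    catch {
      case e: Exception =>
        withNewExecutionId {
          // here the failed plan could be posted to listeners under the new id
        }
        throw e
    }
  }
}
```

The point of the design is that the success path allocates no extra id; only the failure path opens a new execution-id scope so the error is visible to listeners.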
cloud-fan commented on code in PR #42467:
URL: https://github.com/apache/spark/pull/42467#discussion_r1293025786
docs/sql-ref-syntax-ddl-declare-variable.md:
@@ -0,0 +1,82 @@
+---
+layout: global
+title: DECLARE VARIABLE
+displayTitle: DECLARE VARIABLE
+license: |
+ Licensed to the Apache Software
LuciferYang opened a new pull request, #42480:
URL: https://github.com/apache/spark/pull/42480
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
wangyum commented on PR #42474:
URL: https://github.com/apache/spark/pull/42474#issuecomment-1676714192
Thanks. Merged to master.
wangyum closed pull request #42474: [SPARK-44792][BUILD] Upgrade curator to
5.2.0
URL: https://github.com/apache/spark/pull/42474
LuciferYang commented on PR #42385:
URL: https://github.com/apache/spark/pull/42385#issuecomment-1676690789
This PR caused the failure of the Scala 2.13 MiMa check.
https://github.com/apache/spark/pull/42479
grundprinzip commented on code in PR #42478:
URL: https://github.com/apache/spark/pull/42478#discussion_r1292973359
sql/api/src/main/scala/org/apache/spark/sql/catalyst/encoders/OuterScopes.scala:
@@ -26,28 +26,9 @@ import org.apache.spark.util.SparkClassUtils
object Ou
LuciferYang opened a new pull request, #42479:
URL: https://github.com/apache/spark/pull/42479
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
LuciferYang opened a new pull request, #42477:
URL: https://github.com/apache/spark/pull/42477
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
panbingkun commented on PR #42425:
URL: https://github.com/apache/spark/pull/42425#issuecomment-1676614757
Alternatively, the following is also a solution, but the server
configuration needs to be modified.
ref:
https://developers.google.com/search/docs/crawling-indexing/consolidate-dupl
xiaoa6435 commented on code in PR #42431:
URL: https://github.com/apache/spark/pull/42431#discussion_r1292926346
mllib/src/main/scala/org/apache/spark/mllib/stat/correlation/SpearmanCorrelation.scala:
@@ -65,8 +65,8 @@ private[stat] object SpearmanCorrelation extends Corre
ukby1234 commented on code in PR #42296:
URL: https://github.com/apache/spark/pull/42296#discussion_r1292923823
core/src/main/scala/org/apache/spark/MapOutputTracker.scala:
@@ -1288,6 +1288,30 @@ private[spark] class MapOutputTrackerWorker(conf:
SparkConf) extends MapOutp
hvanhovell commented on PR #42476:
URL: https://github.com/apache/spark/pull/42476#issuecomment-1676591099
@bogao007 PTAL
zhengruifeng commented on PR #42451:
URL: https://github.com/apache/spark/pull/42451#issuecomment-1676590123
thanks, merged to master and branch-3.5
zhengruifeng closed pull request #42451: [SPARK-44775][PYTHON][DOCS] Add
missing version information in DataFrame APIs
URL: https://github.com/apache/spark/pull/42451
hvanhovell opened a new pull request, #42476:
URL: https://github.com/apache/spark/pull/42476
### What changes were proposed in this pull request?
When you try to run a streaming query from the REPL for example:
```scala
val add1 = udf((i: Long) => i + 1)
val query = spark.readStr
```
advancedxy commented on PR #42255:
URL: https://github.com/apache/spark/pull/42255#issuecomment-1676584021
Gently ping @cloud-fan @HyukjinKwon
srowen commented on PR #42469:
URL: https://github.com/apache/spark/pull/42469#issuecomment-1676570724
It should be in both places.
It's interesting. The typical CONTRIBUTING.md text that we have says, "When
you contribute code, you affirm that the contribution is your original work",
yaooqinn commented on PR #42462:
URL: https://github.com/apache/spark/pull/42462#issuecomment-1676567114
Thanks for the explanation @HyukjinKwon. I'm OK with it if we already have
precedents like avro and csv
panbingkun commented on PR #42425:
URL: https://github.com/apache/spark/pull/42425#issuecomment-1676556063
> Here is an example of a documentation page for a specific version:
https://spark.apache.org/docs/3.1.3/api/python/reference/api/pyspark.sql.DataFrame.withColumn.html
>
> This i
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292893433
resource-managers/yarn/src/test/scala/org/apache/spark/deploy/yarn/ClientSuite.scala:
@@ -666,6 +666,42 @@ class ClientSuite extends SparkFunSuite with Matchers
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292892702
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:
@@ -485,7 +534,12 @@ private[spark] class Client(
val localResources = Has
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292891976
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:
@@ -458,6 +461,52 @@ private[spark] class Client(
new Path(resolvedDestDir
zhengruifeng commented on PR #42469:
URL: https://github.com/apache/spark/pull/42469#issuecomment-1676521158
I guess this should be documented in
https://spark.apache.org/developer-tools.html instead of PR template?
cc @HyukjinKwon @gatorsmile @srowen
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292889034
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/config.scala:
@@ -462,6 +462,31 @@ package object config extends Logging {
.stringConf
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292888633
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/config.scala:
@@ -462,6 +462,31 @@ package object config extends Logging {
.stringConf
zhengruifeng commented on code in PR #42451:
URL: https://github.com/apache/spark/pull/42451#discussion_r1292888463
python/pyspark/sql/dataframe.py:
@@ -4066,6 +4078,9 @@ def dropDuplicatesWithinWatermark(self, subset:
Optional[List[str]] = None) -> "
.. versi
zhengruifeng commented on code in PR #42451:
URL: https://github.com/apache/spark/pull/42451#discussion_r1292888387
python/pyspark/sql/dataframe.py:
@@ -3540,6 +3546,9 @@ def melt(
.. versionadded:: 3.4.0
+.. versionchanged:: 3.4.0
Review Comment:
hvanhovell closed pull request #42473: [SPARK-44791][CONNECT] Make
ArrowDeserializer work with REPL generated classes
URL: https://github.com/apache/spark/pull/42473
hvanhovell commented on PR #42473:
URL: https://github.com/apache/spark/pull/42473#issuecomment-1676515981
Merging to master/3.5
itholic commented on PR #42388:
URL: https://github.com/apache/spark/pull/42388#issuecomment-1676515460
I don't see any relevant log for the test failure from result as below:
```
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
```
shuwang21 commented on code in PR #42357:
URL: https://github.com/apache/spark/pull/42357#discussion_r1292885986
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala:
@@ -458,6 +461,52 @@ private[spark] class Client(
new Path(resolvedDestDir
github-actions[bot] closed pull request #40954: [PYSPARK] [CONNECT] [ML]
PySpark UDF supports python package dependencies
URL: https://github.com/apache/spark/pull/40954
github-actions[bot] closed pull request #41033: Update bufbuild plugin
references
URL: https://github.com/apache/spark/pull/41033
hvanhovell closed pull request #42418: [SPARK-44736][CONNECT] Add
Dataset.explode to Spark Connect Scala Client
URL: https://github.com/apache/spark/pull/42418
hvanhovell commented on PR #42418:
URL: https://github.com/apache/spark/pull/42418#issuecomment-1676430570
Merging.
bersprockets commented on PR #42075:
URL: https://github.com/apache/spark/pull/42075#issuecomment-1676417784
Super late review:
I think `boundGenerator` needs to be initialized somewhere around
[here](https://github.com/apache/spark/blob/7070b3672d8426834ff936fff4543b10093042fc/sql/co