panbingkun commented on code in PR #46301:
URL: https://github.com/apache/spark/pull/46301#discussion_r1587105724
##
common/utils/src/main/java/org/apache/spark/internal/Logger.java:
##
@@ -0,0 +1,224 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or more
panbingkun commented on PR #46288:
URL: https://github.com/apache/spark/pull/46288#issuecomment-2089669299
> Although we are waiting for `Ammonite` still, could you rebase this PR once
more, @panbingkun ?
>
> * [Add support for Scala 2.13.14
grundprinzip commented on code in PR #46297:
URL: https://github.com/apache/spark/pull/46297#discussion_r1587143920
##
dev/requirements.txt:
##
@@ -16,6 +16,7 @@ memory-profiler>=0.61.0
# PySpark test dependencies
unittest-xml-reporting
openpyxl
+parameterized
Review Comment:
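The diff above adds `parameterized` to the PySpark test dependencies; that library lets one test method expand into a separate case per input tuple (via `@parameterized.expand`). As a rough stdlib-only analogue of the same pattern (this uses `unittest.subTest`, not the library's actual API):

```python
# Stdlib-only sketch of per-case test expansion, analogous to what the
# `parameterized` dependency provides via @parameterized.expand.
import unittest

CASES = [("zero", 0, 0), ("two", 2, 4), ("neg", -3, 9)]

class SquareTest(unittest.TestCase):
    def test_square(self):
        for name, x, expected in CASES:
            # each subTest is reported independently, like a generated case
            with self.subTest(name=name):
                self.assertEqual(x * x, expected)

result = unittest.TextTestRunner(verbosity=0).run(
    unittest.defaultTestLoader.loadTestsFromTestCase(SquareTest)
)
print(result.wasSuccessful())
```

The practical difference is that `parameterized.expand` generates distinct test methods (so each case shows up separately in xml reports), which is likely why it was added alongside `unittest-xml-reporting`.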
rajatrj20 commented on PR #45350:
URL: https://github.com/apache/spark/pull/45350#issuecomment-2089760654
@cloud-fan This change broke an existing behaviour. When an aliased generator
field A is referenced in another field B in the project list, it creates a
situation where B will
HyukjinKwon commented on code in PR #46297:
URL: https://github.com/apache/spark/pull/46297#discussion_r1587187092
##
python/pyspark/sql/tests/connect/client/test_client.py:
##
@@ -18,13 +18,15 @@
import unittest
import uuid
from collections.abc import Generator
-from typing
HyukjinKwon commented on PR #46298:
URL: https://github.com/apache/spark/pull/46298#issuecomment-2089909746
https://github.com/HyukjinKwon/spark/actions/runs/8920156555/job/24497640296
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
cloud-fan commented on PR #45350:
URL: https://github.com/apache/spark/pull/45350#issuecomment-2090237466
> SELECT col_1, EXPLODE(MAP_KEYS(map_str_col)) AS key, map_str_col[key] AS
value FROM nestedTable1;
I think this can be supported with LCA (lateral column alias). cc @anchovYu
JoshRosen opened a new pull request, #46333:
URL: https://github.com/apache/spark/pull/46333
### What changes were proposed in this pull request?
While migrating the `NTile` expression's type check failures to the new
error class framework, PR
vladimirg-db commented on PR #46318:
URL: https://github.com/apache/spark/pull/46318#issuecomment-2089979391
Yes, @dongjoon-hyun, my intent was to add a small improvement after the
[SPARK-47939](https://issues.apache.org/jira/browse/SPARK-47939), which was
resolved by my PR recently.
HyukjinKwon opened a new pull request, #46334:
URL: https://github.com/apache/spark/pull/46334
### What changes were proposed in this pull request?
This PR proposes to skip the tests that fail with 3.5 client and 4.0 server
in Spark Connect (by adding
HyukjinKwon commented on PR #41946:
URL: https://github.com/apache/spark/pull/41946#issuecomment-2089747330
cc @WeichenXu123, @lu-wang-dl, @xinrong-meng, @rithwik-db, @maddiedawson,
mind following up on this, please?
HyukjinKwon commented on code in PR #46297:
URL: https://github.com/apache/spark/pull/46297#discussion_r1587187092
##
python/pyspark/sql/tests/connect/client/test_client.py:
##
@@ -18,13 +18,15 @@
import unittest
import uuid
from collections.abc import Generator
-from typing
HyukjinKwon closed pull request #46297: [SPARK-48056][CONNECT][PYTHON]
Re-execute plan if a SESSION_NOT_FOUND error is raised and no partial response
was received
URL: https://github.com/apache/spark/pull/46297
nija-at commented on code in PR #46297:
URL: https://github.com/apache/spark/pull/46297#discussion_r1587297283
##
dev/requirements.txt:
##
@@ -16,6 +16,7 @@ memory-profiler>=0.61.0
# PySpark test dependencies
unittest-xml-reporting
openpyxl
+parameterized
Review Comment:
HyukjinKwon commented on PR #46297:
URL: https://github.com/apache/spark/pull/46297#issuecomment-2090362203
Merged to master.
panbingkun commented on PR #46288:
URL: https://github.com/apache/spark/pull/46288#issuecomment-2089952489
>
https://repo1.maven.org/maven2/com/typesafe/genjavadoc/genjavadoc-plugin_2.13.14/0.19/
I have updated the version of `genjavadoc` in the file
`project/SparkBuild.scala`.
HyukjinKwon commented on code in PR #46334:
URL: https://github.com/apache/spark/pull/46334#discussion_r1587434003
##
python/pyspark/ml/tests/connect/test_connect_tuning.py:
##
@@ -15,16 +15,17 @@
# See the License for the specific language governing permissions and
#
HyukjinKwon commented on PR #43888:
URL: https://github.com/apache/spark/pull/43888#issuecomment-2089920144
Let me backport this to branch-3.5 as well.
gengliangwang commented on code in PR #46301:
URL: https://github.com/apache/spark/pull/46301#discussion_r1587185157
##
common/utils/src/main/java/org/apache/spark/internal/Logger.java:
##
@@ -0,0 +1,224 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or
bluzy commented on PR #46025:
URL: https://github.com/apache/spark/pull/46025#issuecomment-2089855493
PTAL @dongjoon-hyun @mridulm
dongjoon-hyun commented on PR #46333:
URL: https://github.com/apache/spark/pull/46333#issuecomment-2090622363
cc @LuciferYang and @MaxGekk from
- https://github.com/apache/spark/pull/38457
dongjoon-hyun closed pull request #46333: [SPARK-48081] Fix ClassCastException
in NTile.checkInputDataTypes() when argument is non-foldable or of wrong type
URL: https://github.com/apache/spark/pull/46333
dongjoon-hyun closed pull request #46330: [SPARK-48079][BUILD] Upgrade
maven-install/deploy-plugin to 3.1.2
URL: https://github.com/apache/spark/pull/46330
dongjoon-hyun commented on code in PR #46288:
URL: https://github.com/apache/spark/pull/46288#discussion_r1587813517
##
connector/connect/client/jvm/pom.xml:
##
@@ -73,7 +73,7 @@
com.lihaoyi
- ammonite_${scala.version}
+ ammonite_2.13.13
Review Comment:
dongjoon-hyun commented on PR #46333:
URL: https://github.com/apache/spark/pull/46333#issuecomment-2090847049
Oh, there was a test failure in 3.5/3.4.
Could you make backport PRs to branch-3.5 and branch-3.4, @JoshRosen ?
dongjoon-hyun closed pull request #46318: [SPARK-48072][SQL][TESTS] Improve
SQLQuerySuite test output - use `===` instead of `sameElements` for Arrays
URL: https://github.com/apache/spark/pull/46318
dongjoon-hyun commented on PR #46334:
URL: https://github.com/apache/spark/pull/46334#issuecomment-2090614620
cc @grundprinzip , too
dongjoon-hyun commented on PR #46288:
URL: https://github.com/apache/spark/pull/46288#issuecomment-2090615461
Thanks!
dongjoon-hyun commented on PR #43888:
URL: https://github.com/apache/spark/pull/43888#issuecomment-2090629597
Thank you, @HyukjinKwon !
panbingkun commented on code in PR #46288:
URL: https://github.com/apache/spark/pull/46288#discussion_r1587786326
##
connector/connect/client/jvm/pom.xml:
##
@@ -73,7 +73,7 @@
com.lihaoyi
- ammonite_${scala.version}
+ ammonite_2.13.13
Review Comment:
dongjoon-hyun commented on PR #43888:
URL: https://github.com/apache/spark/pull/43888#issuecomment-2090851277
I also cherry-picked it to branch-3.4.
panbingkun opened a new pull request, #46335:
URL: https://github.com/apache/spark/pull/46335
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
### How
dongjoon-hyun commented on PR #46333:
URL: https://github.com/apache/spark/pull/46333#issuecomment-2090626063
Merged to master/3.5/3.4 for Apache Spark 4.0.0-preview and 3.5.2 and 3.4.4.
panbingkun commented on PR #46301:
URL: https://github.com/apache/spark/pull/46301#issuecomment-2090643728
@gengliangwang
This pr is ready for review.
panbingkun commented on PR #46288:
URL: https://github.com/apache/spark/pull/46288#issuecomment-2090712115
> jline
I guess it may be related to the fact that an `Ammonite` release supporting
Scala 2.13.14 has not been published yet.
WweiL opened a new pull request, #46339:
URL: https://github.com/apache/spark/pull/46339
### What changes were proposed in this pull request?
Backport Client Side StreamingQueryListener to 3.5
### Why are the changes needed?
To pass cross-version test
###
WweiL opened a new pull request, #46341:
URL: https://github.com/apache/spark/pull/46341
### What changes were proposed in this pull request?
We are backporting the client-side StreamingQueryListener to branch-3.5. In
case there is usage of the server-side listener and users want
dongjoon-hyun closed pull request #46341: [DO-NOT-REVIEW]
[SPARK-48093][SS][CONNECT][3.5] Add config to switch between client side and
server side StreamingQueryListener
URL: https://github.com/apache/spark/pull/46341
dongjoon-hyun commented on PR #46341:
URL: https://github.com/apache/spark/pull/46341#issuecomment-2091452810
Since this is not for review, I'd recommend opening this PR against your own
repository. Your GitHub Actions can verify your PR, @WweiL .
WweiL commented on PR #46339:
URL: https://github.com/apache/spark/pull/46339#issuecomment-2091452004
@dongjoon-hyun Thanks for the comment! May I know if it is acceptable to
merge this along with the added config? https://github.com/apache/spark/pull/46341
We can allow users to switch between
dongjoon-hyun opened a new pull request, #46342:
URL: https://github.com/apache/spark/pull/46342
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
gene-db commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1588311315
##
python/pyspark/sql/variant_utils.py:
##
@@ -245,47 +245,57 @@ def _get_string(cls, value: bytes, pos: int) -> str:
length = cls._read_long(value,
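For context, `_get_string` follows the common length-prefixed decoding pattern: read a length field, then slice that many bytes and decode them. A minimal illustrative sketch of the pattern (the 4-byte little-endian length used here is an assumption for the sketch, not the Variant binary spec):

```python
# Illustrative length-prefixed string decoding; the fixed 4-byte
# little-endian length field is invented for this sketch, not Variant's layout.
import struct

def read_string(buf: bytes, pos: int) -> tuple[str, int]:
    (length,) = struct.unpack_from("<I", buf, pos)  # read the length prefix
    start = pos + 4
    if start + length > len(buf):
        raise ValueError("string extends past end of buffer")
    # decode exactly `length` bytes and return the next read position
    return buf[start:start + length].decode("utf-8"), start + length

payload = struct.pack("<I", 5) + b"spark" + struct.pack("<I", 2) + b"ok"
first, nxt = read_string(payload, 0)
second, _ = read_string(payload, nxt)
print(first, second)  # spark ok
```

The bounds check before slicing matters in practice: a corrupted length should raise rather than silently return a truncated string.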
WweiL opened a new pull request, #46340:
URL: https://github.com/apache/spark/pull/46340
### What changes were proposed in this pull request?
We are backporting the client-side StreamingQueryListener to branch-3.5. In
case there is usage of the server-side listener and users want
chenhao-db commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1588306639
##
common/variant/src/main/java/org/apache/spark/types/variant/VariantUtil.java:
##
@@ -392,21 +392,32 @@ public static double getDouble(byte[] value, int pos) {
dongjoon-hyun commented on PR #46342:
URL: https://github.com/apache/spark/pull/46342#issuecomment-2091492692
Could you review this PR, @huaxingao ?
gene-db commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1588226317
##
common/variant/src/main/java/org/apache/spark/types/variant/VariantUtil.java:
##
@@ -392,21 +392,32 @@ public static double getDouble(byte[] value, int pos) {
gengliangwang commented on PR #46312:
URL: https://github.com/apache/spark/pull/46312#issuecomment-2091365435
Thanks, merging to master
gengliangwang closed pull request #46312: [SPARK-48067][SQL] Fix variant
default columns
URL: https://github.com/apache/spark/pull/46312
chenhao-db commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1588252744
##
common/variant/src/main/java/org/apache/spark/types/variant/VariantUtil.java:
##
@@ -392,21 +392,32 @@ public static double getDouble(byte[] value, int pos) {
JoshRosen opened a new pull request, #46336:
URL: https://github.com/apache/spark/pull/46336
branch-3.5 pick of PR https://github.com/apache/spark/pull/46333 , fixing a
test issue due to differences in expected error-message parameter formatting
across branches; original description follows
romibuzi commented on PR #34158:
URL: https://github.com/apache/spark/pull/34158#issuecomment-2091427975
I had filed a JIRA prior to finding this PR:
https://issues.apache.org/jira/browse/SPARK-48043. I'm facing a similar issue
with the call to `Utils.isPushBasedShuffleEnabled` in
JoshRosen opened a new pull request, #46337:
URL: https://github.com/apache/spark/pull/46337
branch-3.4 pick of PR https://github.com/apache/spark/pull/46333 , fixing a
test issue due to differences in expected error-message parameter formatting
across branches; original description follows
JoshRosen commented on PR #46333:
URL: https://github.com/apache/spark/pull/46333#issuecomment-2091029447
Backport PRs:
- 3.5: https://github.com/apache/spark/pull/46336
- 3.4: https://github.com/apache/spark/pull/46337
dongjoon-hyun closed pull request #46336: [SPARK-48081][SQL][3.5] Fix
ClassCastException in NTile.checkInputDataTypes() when argument is non-foldable
or of wrong type
URL: https://github.com/apache/spark/pull/46336
dongjoon-hyun commented on PR #46336:
URL: https://github.com/apache/spark/pull/46336#issuecomment-2091437784
Merged to branch-3.5.
dongjoon-hyun commented on PR #46337:
URL: https://github.com/apache/spark/pull/46337#issuecomment-2091448715
Merged to branch-3.4.
dongjoon-hyun closed pull request #46337: [SPARK-48081][SQL][3.4] Fix
ClassCastException in NTile.checkInputDataTypes() when argument is non-foldable
or of wrong type
URL: https://github.com/apache/spark/pull/46337
dongjoon-hyun closed pull request #46339: [SPARK-48089][SS][CONNECT][3.5]
Backport Client Side StreamingQueryListener to 3.5
URL: https://github.com/apache/spark/pull/46339
dongjoon-hyun commented on PR #46339:
URL: https://github.com/apache/spark/pull/46339#issuecomment-2091447286
Let me close this to prevent accidental merging. We can continue the
discussion on this PR after closing.
chenhao-db commented on code in PR #46338:
URL: https://github.com/apache/spark/pull/46338#discussion_r1588326114
##
python/pyspark/sql/variant_utils.py:
##
@@ -245,47 +245,57 @@ def _get_string(cls, value: bytes, pos: int) -> str:
length =
dtenedor commented on PR #46309:
URL: https://github.com/apache/spark/pull/46309#issuecomment-2091585054
@gengliangwang thanks for your review, responded to comments, please look
again.
dongjoon-hyun commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588493621
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
sunchao closed pull request #46325: [SPARK-48065][SQL] SPJ:
allowJoinKeysSubsetOfPartitionKeys is too strict
URL: https://github.com/apache/spark/pull/46325
szehon-ho commented on PR #46325:
URL: https://github.com/apache/spark/pull/46325#issuecomment-2091913189
Thanks for fast review! Yea will do that.
dongjoon-hyun commented on PR #46347:
URL: https://github.com/apache/spark/pull/46347#issuecomment-2091940855
Could you review this PR, @HyukjinKwon ?
github-actions[bot] commented on PR #44829:
URL: https://github.com/apache/spark/pull/44829#issuecomment-2091944038
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #44853:
URL: https://github.com/apache/spark/pull/44853#issuecomment-2091944028
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #40782:
URL: https://github.com/apache/spark/pull/40782#issuecomment-2091944074
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
github-actions[bot] commented on PR #44725:
URL: https://github.com/apache/spark/pull/44725#issuecomment-2091944051
We're closing this PR because it hasn't been updated in a while. This isn't
a judgement on the merit of the PR in any way. It's just a way of keeping the
PR queue manageable.
HyukjinKwon commented on PR #46334:
URL: https://github.com/apache/spark/pull/46334#issuecomment-2091994700
`Spark Connect` is actually for testing the pure Python library. Spark 3.5
doesn't have it ... from 4.0 <> 4.1, we could leverage the pure Python library
build to test them.
I am
dongjoon-hyun commented on PR #46343:
URL: https://github.com/apache/spark/pull/46343#issuecomment-2091612469
Thank you, @viirya !
dongjoon-hyun commented on PR #46346:
URL: https://github.com/apache/spark/pull/46346#issuecomment-2091823433
Thank you so much, @gengliangwang .
Merged to master.
dongjoon-hyun closed pull request #46344: [SPARK-48097][INFRA] Limit GHA job
execution time to up to 3 hours in `build_and_test.yml`
URL: https://github.com/apache/spark/pull/46344
dongjoon-hyun commented on PR #46344:
URL: https://github.com/apache/spark/pull/46344#issuecomment-2091828537
Thank you, @HyukjinKwon !
WweiL closed pull request #46340: [DO-NOT-REVIEW]
[SPARK-48093][SS][CONNECT][4.0] Add server side config handler for 3.5 client
requesting Server side StreamingQueryListener
URL: https://github.com/apache/spark/pull/46340
sunchao commented on PR #46325:
URL: https://github.com/apache/spark/pull/46325#issuecomment-2091895489
Merged to master, thanks @szehon-ho ! Do you think we need to backport this
to branch-3.4 and branch-3.5?
dongjoon-hyun commented on code in PR #46349:
URL: https://github.com/apache/spark/pull/46349#discussion_r1588525182
##
docs/configuration.md:
##
@@ -3670,14 +3670,17 @@ Note: When running Spark on YARN in `cluster` mode,
environment variables need t
# Configuring Logging
gengliangwang commented on PR #46349:
URL: https://github.com/apache/spark/pull/46349#issuecomment-2091909130
@dongjoon-hyun I double checked and removed some comments. PTAL, thanks!
dongjoon-hyun opened a new pull request, #46343:
URL: https://github.com/apache/spark/pull/46343
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
###
dongjoon-hyun opened a new pull request, #46344:
URL: https://github.com/apache/spark/pull/46344
…
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
dongjoon-hyun closed pull request #46343: [SPARK-48096][INFRA] Run
`build_maven_java21_macos14.yml` every two days
URL: https://github.com/apache/spark/pull/46343
WweiL opened a new pull request, #46345:
URL: https://github.com/apache/spark/pull/46345
### What changes were proposed in this pull request?
An extra assignment was added when we first introduced
`dropDuplicatesWithinWatermark` in
dongjoon-hyun opened a new pull request, #46346:
URL: https://github.com/apache/spark/pull/46346
…
### What changes were proposed in this pull request?
### Why are the changes needed?
### Does this PR introduce _any_ user-facing change?
dongjoon-hyun commented on PR #46346:
URL: https://github.com/apache/spark/pull/46346#issuecomment-2091830231
Oh, sorry. I need to revise this PR.
szehon-ho commented on PR #46325:
URL: https://github.com/apache/spark/pull/46325#issuecomment-2091841349
@sunchao I think it's a simple fix, can you take a look?
gengliangwang commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588499107
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
dongjoon-hyun commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588499484
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
gengliangwang commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588500715
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
dongjoon-hyun commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588500029
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
dongjoon-hyun commented on PR #46346:
URL: https://github.com/apache/spark/pull/46346#issuecomment-2091864830
Thank you for review and approval.
gengliangwang commented on PR #46349:
URL: https://github.com/apache/spark/pull/46349#issuecomment-2091886078
cc @dtenedor @panbingkun as well.
gengliangwang opened a new pull request, #46349:
URL: https://github.com/apache/spark/pull/46349
### What changes were proposed in this pull request?
- Rename the current log4j2.properties.template as
log4j2.properties.pattern-layout-template
- Enable structured
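The template rename above is about enabling structured (JSON) logging in Spark's default log4j2 configuration. As a language-agnostic illustration of what "structured" means here (a Python sketch, not Spark's JVM implementation; the field names are chosen for this example), each log record becomes one machine-parseable JSON object:

```python
# Minimal JSON log formatter illustrating structured logging; the field
# names ("ts", "level", "logger", "msg") are this sketch's choice, not Spark's.
import json
import logging

class JsonFormatter(logging.Formatter):
    def format(self, record: logging.LogRecord) -> str:
        return json.dumps({
            "ts": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "msg": record.getMessage(),
        })

handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
log = logging.getLogger("demo")
log.addHandler(handler)
log.setLevel(logging.INFO)
log.info("query %s started", "q1")  # emits one JSON line
```

The payoff over a pattern layout is that downstream tools can filter on fields (`level`, `logger`) instead of regex-parsing free text, which is the motivation for shipping a JSON template as the default.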
dongjoon-hyun commented on code in PR #46346:
URL: https://github.com/apache/spark/pull/46346#discussion_r1588493621
##
project/SparkBuild.scala:
##
@@ -257,7 +257,7 @@ object SparkBuild extends PomBuild {
lazy val sharedSettings = sparkGenjavadocSettings ++
anishshri-db opened a new pull request, #46350:
URL: https://github.com/apache/spark/pull/46350
### What changes were proposed in this pull request?
Track duration for acquiring source/sink metrics while reporting streaming
query progress
### Why are the changes needed?
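The change described above adds timing around source/sink metrics collection during progress reporting. The basic pattern (a hedged sketch; `collect_sink_metrics` and the helper name are stand-ins, not the PR's actual code) is to bracket each metrics call with a monotonic clock:

```python
# Sketch of timing a metrics-gathering call with a monotonic clock;
# `collect_sink_metrics` is a stand-in, not Spark's actual API.
import time

def timed_call(fn, *args, **kwargs):
    start = time.monotonic()
    result = fn(*args, **kwargs)
    # monotonic() is immune to wall-clock adjustments, so the delta is safe
    elapsed_ms = (time.monotonic() - start) * 1000.0
    return result, elapsed_ms

def collect_sink_metrics():
    # pretend this queries a sink for its committed-output metrics
    return {"numOutputRows": 42}

metrics, duration_ms = timed_call(collect_sink_metrics)
# duration_ms would then be attached to the streaming progress report
print(metrics, duration_ms >= 0.0)
```

Using a monotonic clock rather than wall-clock time matters for durations: system clock corrections (NTP, DST) would otherwise produce negative or wildly wrong values.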
HyukjinKwon closed pull request #46287: [SPARK-48048][CONNECT][SS] Added client
side listener support for Scala
URL: https://github.com/apache/spark/pull/46287
HyukjinKwon commented on PR #46287:
URL: https://github.com/apache/spark/pull/46287#issuecomment-2091982060
Merged to master.
gengliangwang commented on code in PR #46301:
URL: https://github.com/apache/spark/pull/46301#discussion_r1588589564
##
common/utils/src/main/java/org/apache/spark/internal/Logger.java:
##
@@ -0,0 +1,184 @@
+/*
+ * Licensed to the Apache Software Foundation (ASF) under one or
dongjoon-hyun commented on PR #46343:
URL: https://github.com/apache/spark/pull/46343#issuecomment-2091539703
Could you review this PR, @viirya ?
dongjoon-hyun closed pull request #46342: [SPARK-48095][INFRA] Run
`build_non_ansi.yml` once per day
URL: https://github.com/apache/spark/pull/46342
huaxingao commented on PR #46342:
URL: https://github.com/apache/spark/pull/46342#issuecomment-2091567244
LGTM. Thanks @dongjoon-hyun