dongjoon-hyun commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-2088785354
For the record, I refocused this PR to the `--python-executable` issue only. The Python linter passed without any issues.
(screenshot of the passing Python linter run, 2024-05-01)
dongjoon-hyun commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-2088824377
Thank you, @huaxingao !
Merged to master/3.5/3.4.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
dongjoon-hyun closed pull request #46314: [SPARK-48068][PYTHON] `mypy` should
have `--python-executable` parameter
URL: https://github.com/apache/spark/pull/46314
dtenedor commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586662927
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
gengliangwang commented on code in PR #46309:
URL: https://github.com/apache/spark/pull/46309#discussion_r1586662318
##
core/src/main/scala/org/apache/spark/deploy/master/Master.scala:
##
@@ -1294,7 +1293,7 @@ private[deploy] class Master(
if (worker.state !=
fanyue-xia opened a new pull request, #46324:
URL: https://github.com/apache/spark/pull/46324
### What changes were proposed in this pull request?
Add type checking for `to_avro` and `from_avro` for PySpark.
### Why are the changes needed?
If we perform type checking for
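The kind of up-front argument type checking this PR describes can be sketched as follows; the `from_avro` body and the `PySparkTypeError` class here are hypothetical stand-ins for illustration, not the actual PySpark implementation:

```python
from typing import Optional

class PySparkTypeError(TypeError):
    """Hypothetical stand-in for pyspark.errors.PySparkTypeError."""

def from_avro(data, jsonFormatSchema: str, options: Optional[dict] = None):
    # Validate Python-side argument types before touching the JVM, so a
    # misuse fails fast with a clear message rather than surfacing as an
    # opaque JVM-side stack trace.
    if not isinstance(jsonFormatSchema, str):
        raise PySparkTypeError(
            f"jsonFormatSchema should be a str, got {type(jsonFormatSchema).__name__}"
        )
    if options is not None and not isinstance(options, dict):
        raise PySparkTypeError(
            f"options should be a dict or None, got {type(options).__name__}"
        )
    # Placeholder for the real delegation to the JVM implementation.
    return (data, jsonFormatSchema, options)
```

Calling `from_avro("col", 123)` under this sketch raises `PySparkTypeError` immediately instead of failing later on the JVM side.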
dongjoon-hyun closed pull request #46317: [SPARK-48071][INFRA][PYTHON] Use
Python 3.10 in `Python Linter` step
URL: https://github.com/apache/spark/pull/46317
gengliangwang commented on PR #46322:
URL: https://github.com/apache/spark/pull/46322#issuecomment-2088917364
cc @panbingkun @dtenedor as well
dongjoon-hyun commented on PR #46323:
URL: https://github.com/apache/spark/pull/46323#issuecomment-2088974804
Let's make sure that all tests pass on this PR builder.
gengliangwang commented on code in PR #46322:
URL: https://github.com/apache/spark/pull/46322#discussion_r1586724715
##
common/utils/src/test/scala/org/apache/spark/util/StructuredLoggingSuite.scala:
##
@@ -192,8 +192,8 @@ class StructuredLoggingSuite extends LoggingSuiteBase {
dongjoon-hyun commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-2088982611
Thank you, @viirya !
gengliangwang commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586770080
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
dongjoon-hyun commented on PR #46272:
URL: https://github.com/apache/spark/pull/46272#issuecomment-204219
Thank you, @beliefer .
Merged to master.
dongjoon-hyun closed pull request #46272: [SPARK-46009][SQL][FOLLOWUP] Remove
unused PERCENTILE_CONT and PERCENTILE_DISC in g4
URL: https://github.com/apache/spark/pull/46272
gengliangwang opened a new pull request, #46323:
URL: https://github.com/apache/spark/pull/46323
### What changes were proposed in this pull request?
Currently, the following query will throw DIVIDE_BY_ZERO error instead of
returning null
```
SELECT
dtenedor commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586664236
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586783457
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
nchammas opened a new pull request, #46321:
URL: https://github.com/apache/spark/pull/46321
### What changes were proposed in this pull request?
Minor fixes to the English of some comments I added in #44920.
### Why are the changes needed?
Proper English -- OK, not
dongjoon-hyun commented on PR #46288:
URL: https://github.com/apache/spark/pull/46288#issuecomment-2088823817
Thank you so much, @SethTisue !
gengliangwang commented on PR #46323:
URL: https://github.com/apache/spark/pull/46323#issuecomment-2088905069
@dongjoon-hyun Thanks for pointing out the mistaken changes! They are now
reverted in
https://github.com/apache/spark/pull/46323/commits/0a43e3ab982dcb21bfaf9594db0233a354f12869
dongjoon-hyun commented on code in PR #46322:
URL: https://github.com/apache/spark/pull/46322#discussion_r1586720608
##
common/utils/src/test/scala/org/apache/spark/util/StructuredLoggingSuite.scala:
##
@@ -192,8 +192,8 @@ class StructuredLoggingSuite extends LoggingSuiteBase {
gengliangwang commented on code in PR #46322:
URL: https://github.com/apache/spark/pull/46322#discussion_r1586740612
##
common/utils/src/test/scala/org/apache/spark/util/StructuredLoggingSuite.scala:
##
@@ -192,8 +192,8 @@ class StructuredLoggingSuite extends LoggingSuiteBase {
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586782740
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586783457
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
gengliangwang commented on code in PR #46309:
URL: https://github.com/apache/spark/pull/46309#discussion_r1586810024
##
core/src/main/scala/org/apache/spark/metrics/sink/StatsdReporter.scala:
##
@@ -67,7 +67,8 @@ private[spark] class StatsdReporter(
Try(new DatagramSocket)
gengliangwang commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586809054
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
dongjoon-hyun commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-2088817665
All linters passed.
![Screenshot 2024-05-01 at 10 35
56](https://github.com/apache/spark/assets/9700541/c62cc30e-7f3a-4e4e-a871-c37fa11ab051)
Could you review this PR,
gengliangwang opened a new pull request, #46322:
URL: https://github.com/apache/spark/pull/46322
### What changes were proposed in this pull request?
Improve the readability of JSON loggings via:
1. Use UTC in the timestamp so that the timestamp field value is more
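The UTC-timestamp idea can be illustrated with a small Python sketch (the field names `ts`, `level`, and `msg` are assumptions for illustration, not Spark's structured-logging schema):

```python
import json
from datetime import datetime, timezone

def json_log_record(message: str, level: str = "INFO") -> str:
    # Render the timestamp in UTC (ISO-8601) so records produced on
    # machines in different time zones compare and sort consistently.
    ts = datetime.now(timezone.utc).isoformat(timespec="milliseconds")
    return json.dumps({"ts": ts, "level": level, "msg": message})
```

A record produced this way always carries a `+00:00` offset, so readers never have to guess the writer's local time zone.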
dongjoon-hyun commented on code in PR #46318:
URL: https://github.com/apache/spark/pull/46318#discussion_r1586626193
##
sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala:
##
@@ -4750,12 +4750,13 @@ class SQLQuerySuite extends QueryTest with
SharedSparkSession
gengliangwang commented on code in PR #46309:
URL: https://github.com/apache/spark/pull/46309#discussion_r1586820408
##
core/src/main/scala/org/apache/spark/shuffle/sort/SortShuffleManager.scala:
##
@@ -76,7 +76,8 @@ private[spark] class SortShuffleManager(conf: SparkConf)
PaysonXu opened a new pull request, #46320:
URL: https://github.com/apache/spark/pull/46320
…]: 44 to internal err, 45 to DEFAULT_UNSUPPORTED, 46 to
ADD_DEFAULT_UNSUPPORTED
### What changes were proposed in this pull request?
rename err class
dongjoon-hyun commented on PR #46286:
URL: https://github.com/apache/spark/pull/46286#issuecomment-2088873791
Let me revert `branch-3.5` commit first to recover the CI. Could you make a
PR to branch-3.5, @gengliangwang ?
viirya commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-204973
Looks good to me. Thanks @dongjoon-hyun
gengliangwang commented on PR #46286:
URL: https://github.com/apache/spark/pull/46286#issuecomment-205469
@dongjoon-hyun Thanks a lot. I will create a backport for 3.5.
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586697875
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586697875
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
dongjoon-hyun closed pull request #46316: [SPARK-48070][SQL][TESTS] Support
`AdaptiveQueryExecSuite.runAdaptiveAndVerifyResult` to skip check results
URL: https://github.com/apache/spark/pull/46316
dongjoon-hyun commented on PR #46316:
URL: https://github.com/apache/spark/pull/46316#issuecomment-2088992149
Merged to master for Apache Spark 4.0.0.
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586798259
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
xupefei commented on PR #45701:
URL: https://github.com/apache/spark/pull/45701#issuecomment-2088626203
> > @xupefei there is a genuine test failure. Can you check what is going on?
>
> It seems the test is flaky, even after the previous attempt to fix it:
#45173
I re-ran the
dtenedor commented on PR #46309:
URL: https://github.com/apache/spark/pull/46309#issuecomment-2088811850
@gengliangwang thanks for a thorough review. I followed your instructions
for every comment, and then just resolved them all to clean up the GitHub
conversation history page. Please
dongjoon-hyun commented on PR #46286:
URL: https://github.com/apache/spark/pull/46286#issuecomment-2088871262
Hi, @gengliangwang .
It's a little weird because
- This PR doesn't contain `sql/core/src/test/resources/log4j2.properties`.
- master commit also does.
- branch-3.5
gengliangwang commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586788998
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586798259
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586798259
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586798259
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
richardc-db commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586697875
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/util/ResolveDefaultColumnsUtil.scala:
##
@@ -84,9 +84,16 @@ object ResolveDefaultColumns extends
gengliangwang commented on PR #46309:
URL: https://github.com/apache/spark/pull/46309#issuecomment-2089082622
@dtenedor the test failure looks relevant
xieshuaihu commented on PR #46278:
URL: https://github.com/apache/spark/pull/46278#issuecomment-2088566513
@hvanhovell @HyukjinKwon
There are two reasons to support setting the scheduler pool in Spark Connect.
1. Vanilla Spark supports the fair scheduler and pools; if the server runs in a
srielau commented on PR #46267:
URL: https://github.com/apache/spark/pull/46267#issuecomment-2088685593
@cloud-fan @gengliangwang This is ready for an initial review.
Other than docs, the main open issue is NOT NULL behavior and COMMENTs.
dongjoon-hyun commented on PR #46314:
URL: https://github.com/apache/spark/pull/46314#issuecomment-2088820775
Could you review this PR, @huaxingao ?
dtenedor commented on PR #46312:
URL: https://github.com/apache/spark/pull/46312#issuecomment-2088934443
```
previously we could not set a variant default column like
create table t(
v6 variant default parse_json('{\"k\": \"v\"}')
)
```
@richardc-db this
szehon-ho opened a new pull request, #46325:
URL: https://github.com/apache/spark/pull/46325
### What changes were proposed in this pull request?
If spark.sql.v2.bucketing.allowJoinKeysSubsetOfPartitionKeys.enabled is
true, change KeyGroupedPartitioning.satisfies0(distribution) check
dongjoon-hyun commented on PR #46157:
URL: https://github.com/apache/spark/pull/46157#issuecomment-2089225792
Welcome to the Apache Spark community, @huangzhir .
I added you to the Apache Spark contributor group and assigned SPARK-47934
to you.
Congratulations on your first commit!
viirya commented on code in PR #46273:
URL: https://github.com/apache/spark/pull/46273#discussion_r1586884528
##
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:
##
@@ -2410,6 +2413,26 @@ class AdaptiveQueryExecSuite
}
}
+
viirya commented on code in PR #46273:
URL: https://github.com/apache/spark/pull/46273#discussion_r1586883895
##
sql/core/src/test/scala/org/apache/spark/sql/execution/adaptive/AdaptiveQueryExecSuite.scala:
##
@@ -2410,6 +2413,26 @@ class AdaptiveQueryExecSuite
}
}
+
bogao007 commented on code in PR #46287:
URL: https://github.com/apache/spark/pull/46287#discussion_r1586884029
##
connector/connect/client/jvm/src/main/scala/org/apache/spark/sql/streaming/StreamingQueryListenerBus.scala:
##
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache
dongjoon-hyun commented on PR #46327:
URL: https://github.com/apache/spark/pull/46327#issuecomment-2089349241
Could you review this PR, @HyukjinKwon ?
nchammas commented on code in PR #46328:
URL: https://github.com/apache/spark/pull/46328#discussion_r1586954148
##
python/packaging/classic/setup.py:
##
@@ -307,6 +307,7 @@ def run(self):
"pyspark.errors",
"pyspark.errors.exceptions",
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586968858
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java:
##
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the
HyukjinKwon commented on code in PR #46328:
URL: https://github.com/apache/spark/pull/46328#discussion_r1586976770
##
python/packaging/classic/setup.py:
##
@@ -307,6 +307,7 @@ def run(self):
"pyspark.errors",
"pyspark.errors.exceptions",
yaooqinn commented on PR #46133:
URL: https://github.com/apache/spark/pull/46133#issuecomment-2089391772
Thank you @dongjoon-hyun
dongjoon-hyun commented on PR #46329:
URL: https://github.com/apache/spark/pull/46329#issuecomment-2089425612
Thank you, @HyukjinKwon !
Merged to master.
dongjoon-hyun closed pull request #46329: [SPARK-48078][K8S] Promote
`o.a.s.d.k8s.Constants` to `DeveloperApi`
URL: https://github.com/apache/spark/pull/46329
panbingkun commented on PR #46301:
URL: https://github.com/apache/spark/pull/46301#issuecomment-2089426127
> @panbingkun I see. There are about 88 loggings with variables
>
> ```
> find . -name "*.java"|xargs grep -i
"logger.info\|logger.warn\|logger.error"|grep "{}"| grep -v
dongjoon-hyun commented on PR #46329:
URL: https://github.com/apache/spark/pull/46329#issuecomment-2089570488
Thank you, @viirya !
HyukjinKwon commented on PR #46324:
URL: https://github.com/apache/spark/pull/46324#issuecomment-2089620300
Merged to master.
HyukjinKwon closed pull request #46324: [SPARK-48075] [SS] Add type checking
for PySpark avro functions
URL: https://github.com/apache/spark/pull/46324
gengliangwang commented on code in PR #46312:
URL: https://github.com/apache/spark/pull/46312#discussion_r1586864884
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/literals.scala:
##
@@ -549,6 +549,7 @@ case class Literal (value: Any, dataType:
dongjoon-hyun closed pull request #46323: [SPARK-48016][SQL][3.5] Fix a bug in
try_divide function when with decimals
URL: https://github.com/apache/spark/pull/46323
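The semantics this fix restores (NULL inputs and division errors yield NULL rather than raising) can be sketched in plain Python; this illustrates the intended `try_divide` behavior, not Spark's implementation:

```python
from decimal import Decimal, DivisionByZero, InvalidOperation
from typing import Optional

def try_divide(a: Optional[Decimal], b: Optional[Decimal]) -> Optional[Decimal]:
    # NULL-safe division: a NULL (None) input or an arithmetic error such
    # as division by zero produces None (SQL NULL) instead of an error.
    if a is None or b is None:
        return None
    try:
        return a / b
    except (DivisionByZero, InvalidOperation):
        return None
```

Under this sketch, `try_divide(Decimal("1"), Decimal("0"))` returns `None` rather than raising, which is the behavior the PR fixes for decimal operands.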
dongjoon-hyun commented on PR #46323:
URL: https://github.com/apache/spark/pull/46323#issuecomment-2089200222
Thank you!
HyukjinKwon opened a new pull request, #46328:
URL: https://github.com/apache/spark/pull/46328
### What changes were proposed in this pull request?
This PR is a followup of https://github.com/apache/spark/pull/44920 that
includes `error-conditions.json` into PyPI package.
###
HyukjinKwon commented on PR #46328:
URL: https://github.com/apache/spark/pull/46328#issuecomment-2089346079
cc @nchammas FYI
HyukjinKwon commented on PR #46328:
URL: https://github.com/apache/spark/pull/46328#issuecomment-2089346151
and @itholic too
dongjoon-hyun commented on PR #46326:
URL: https://github.com/apache/spark/pull/46326#issuecomment-2089359042
Thank you, @viirya !
dongjoon-hyun commented on PR #46327:
URL: https://github.com/apache/spark/pull/46327#issuecomment-2089358862
Thank you, @viirya !
HyukjinKwon commented on code in PR #46328:
URL: https://github.com/apache/spark/pull/46328#discussion_r1586971280
##
python/packaging/classic/setup.py:
##
@@ -307,6 +307,7 @@ def run(self):
"pyspark.errors",
"pyspark.errors.exceptions",
HyukjinKwon commented on PR #46328:
URL: https://github.com/apache/spark/pull/46328#issuecomment-2089381467
There is a test at
https://github.com/apache/spark/blob/master/dev/pip-sanity-check.py but that
only runs a basic test that does not import pyspark.errors whereas
HyukjinKwon commented on PR #46328:
URL: https://github.com/apache/spark/pull/46328#issuecomment-2089397703
Merged to master.
It will be tested in the scheduled job.
HyukjinKwon commented on code in PR #44920:
URL: https://github.com/apache/spark/pull/44920#discussion_r1586979834
##
python/MANIFEST.in:
##
@@ -14,13 +14,18 @@
# See the License for the specific language governing permissions and
# limitations under the License.
HyukjinKwon commented on PR #46310:
URL: https://github.com/apache/spark/pull/46310#issuecomment-2089462468
Merged to master.
HyukjinKwon closed pull request #46310: [SPARK-48064][SQL] Update error
messages for routine related error classes
URL: https://github.com/apache/spark/pull/46310
cxzl25 commented on code in PR #46273:
URL: https://github.com/apache/spark/pull/46273#discussion_r1587027095
##
core/src/main/scala/org/apache/spark/util/collection/ExternalSorter.scala:
##
@@ -710,7 +711,7 @@ private[spark] class ExternalSorter[K, V, C](
dongjoon-hyun closed pull request #46332: [SPARK-48080][K8S] Promote
`*MainAppResource` and `NonJVMResource` to `DeveloperApi`
URL: https://github.com/apache/spark/pull/46332
HyukjinKwon commented on PR #46298:
URL: https://github.com/apache/spark/pull/46298#issuecomment-2089586686
Doctest:
```
File
"/home/runner/work/spark/spark-35/python/pyspark/sql/connect/dataframe.py",
line 1057, in pyspark.sql.connect.dataframe.DataFrame.union
Failed example:
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586903355
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppDriverConf.java:
##
@@ -0,0 +1,66 @@
+/*
+ * Licensed to the Apache
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586905774
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppDriverConf.java:
##
@@ -0,0 +1,66 @@
+/*
+ * Licensed to the Apache
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586909100
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java:
##
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the
HyukjinKwon commented on code in PR #46319:
URL: https://github.com/apache/spark/pull/46319#discussion_r1586927903
##
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/analysis/unresolved.scala:
##
@@ -98,7 +98,7 @@ case class ExpressionWithUnresolvedIdentifier(
/**
HyukjinKwon closed pull request #46321: [MINOR] Fix the grammar of some
comments on renaming error classes
URL: https://github.com/apache/spark/pull/46321
HyukjinKwon commented on PR #46321:
URL: https://github.com/apache/spark/pull/46321#issuecomment-2089300250
Merged to master.
dongjoon-hyun closed pull request #46327: [SPARK-48077][K8S] Promote
`KubernetesClientUtils` to `DeveloperApi`
URL: https://github.com/apache/spark/pull/46327
dongjoon-hyun commented on PR #46327:
URL: https://github.com/apache/spark/pull/46327#issuecomment-2089367196
Merged to master for Apache Spark 4.0.0.
dongjoon-hyun commented on PR #46332:
URL: https://github.com/apache/spark/pull/46332#issuecomment-2089464647
Could you review this too, @HyukjinKwon ?
dongjoon-hyun commented on PR #46332:
URL: https://github.com/apache/spark/pull/46332#issuecomment-2089465359
Thank you!
gengliangwang commented on PR #46323:
URL: https://github.com/apache/spark/pull/46323#issuecomment-2089172224
CI passed. Merging to branch-3.5
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586901378
##
gradle.properties:
##
@@ -18,17 +18,23 @@
group=org.apache.spark.k8s.operator
version=0.1.0
-fabric8Version=6.12.1
+# Caution: fabric8
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586907715
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java:
##
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586908280
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java:
##
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the
dongjoon-hyun commented on code in PR #10:
URL:
https://github.com/apache/spark-kubernetes-operator/pull/10#discussion_r1586908000
##
spark-submission-worker/src/main/java/org/apache/spark/k8s/operator/SparkAppResourceSpec.java:
##
@@ -0,0 +1,127 @@
+/*
+ * Licensed to the