Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19649#discussion_r149315115
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/events.scala
---
@@ -110,7 +120,31 @@ case class RenameTableEvent
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149017279
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -705,6 +705,19 @@ private[spark] class Client
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19663
Please also add a [YARN] tag to the PR title; this is actually a YARN problem.
---
-
To unsubscribe, e-mail: reviews-unsubscr
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19663#discussion_r149015858
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala
---
@@ -705,6 +705,19 @@ private[spark] class Client
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18791
@ericvandenbergfb please also fix the PR title, thanks.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19649#discussion_r149009631
--- Diff:
sql/catalyst/src/test/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalogEventSuite.scala
---
@@ -104,6 +109,8 @@ class
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19649#discussion_r149004933
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala
---
@@ -158,7 +173,13 @@ abstract class
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19649#discussion_r149003803
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/catalog/ExternalCatalog.scala
---
@@ -147,7 +154,15 @@ abstract class
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/19649
[SPARK-22405][SQL] Add more ExternalCatalogEvent
## What changes were proposed in this pull request?
We're building a data lineage tool in which we need to monitor the metadata
changes
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19586
I tend to agree with @cloud-fan. I think you can implement your own
serializer outside of Spark to be more specialized for your application; that will
definitely be more efficient than the built
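A minimal sketch of what such an application-side serializer could look like (the `Point` class, `PointSerializer`, and `MyRegistrator` names are hypothetical, not from the PR): a hand-written Kryo serializer for a class you control skips the generic field-by-field reflection path, and is plugged into Spark through the standard `spark.kryo.registrator` hook.

```scala
import com.esotericsoftware.kryo.{Kryo, Serializer}
import com.esotericsoftware.kryo.io.{Input, Output}
import org.apache.spark.serializer.KryoRegistrator

// Hypothetical application class and a hand-written serializer for it.
case class Point(x: Int, y: Int)

class PointSerializer extends Serializer[Point] {
  override def write(kryo: Kryo, out: Output, p: Point): Unit = {
    out.writeInt(p.x)
    out.writeInt(p.y)
  }
  override def read(kryo: Kryo, in: Input, cls: Class[Point]): Point =
    Point(in.readInt(), in.readInt())
}

// Registered via Spark's standard KryoRegistrator hook:
class MyRegistrator extends KryoRegistrator {
  override def registerClasses(kryo: Kryo): Unit =
    kryo.register(classOf[Point], new PointSerializer)
}
// conf.set("spark.kryo.registrator", classOf[MyRegistrator].getName)
```

Because the serializer is owned by the application, it can evolve with the application's data model without any change to Spark itself.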
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19396
Sorry I didn't notice it, will double-check next time.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19396
OK, let me merge to master branch.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19586
Using configurations seems not so elegant; also, configuration is
application-based, so how would you turn this feature on/off at runtime? Sorry
I cannot give you good advice, maybe kryo's
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19396
I'm OK with the current changes.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19586
@ConeyLiu, what about the example below? Does your implementation support
this?
```scala
trait Base { val name: String }
case class A(name: String) extends Base
case class
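The snippet above is truncated in the archive; a self-contained sketch of the kind of hierarchy being asked about (the `B` and `Container` definitions are assumptions added for illustration) might look like:

```scala
// Hypothetical reconstruction of the question: concrete case classes behind a
// common trait, reached through a field whose static type is the base trait.
trait Base { val name: String }
case class A(name: String) extends Base
case class B(name: String, extra: Int) extends Base

case class Container(items: Seq[Base])

// At serialization time each element's static type is Base, but the runtime
// instance may be an A or a B, so a per-class specialized serializer has to
// dispatch on the dynamic class rather than the declared field type.
val c = Container(Seq(A("a"), B("b", 1)))
```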
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19580
Jenkins, retest this please.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19580
jenkins, retest this please.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19580#discussion_r147325260
--- Diff:
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
@@ -267,6 +267,10 @@ private[spark] class ExecutorAllocationManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19580#discussion_r147304200
--- Diff:
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
@@ -267,6 +267,10 @@ private[spark] class ExecutorAllocationManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19580#discussion_r147303973
--- Diff:
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
@@ -678,7 +679,9 @@ private[spark] class ExecutorAllocationManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19580#discussion_r147304306
--- Diff:
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
@@ -709,7 +712,9 @@ private[spark] class ExecutorAllocationManager
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19519
LGTM, merging to master.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19534
@sitalkedia would you please reopen this PR? I think the second issue I
fixed before is not valid anymore, and for the first issue the fix is no different
compared to here
Github user jerryshao closed the pull request at:
https://github.com/apache/spark/pull/11205
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/11205
Verified again; it looks like the 2nd bullet is not valid anymore. I cannot
reproduce it in the latest master branch; this might have already been fixed in
SPARK-13054.
So only the first issue
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/11205
@vanzin , in the current code `stageIdToTaskIndices` cannot be used to
track the number of running tasks, because this structure doesn't remove a task's
index when the task finishes
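The distinction can be sketched as follows (all names hypothetical, not Spark's actual code): a structure suitable for counting running tasks must remove indices again on task completion, which is exactly what `stageIdToTaskIndices` does not do.

```scala
import scala.collection.mutable

// Sketch: running-task indices per stage, kept in sync on both task start
// and task end so that sizes reflect tasks that are currently running.
val stageIdToRunningTasks = mutable.HashMap[Int, mutable.HashSet[Int]]()

def onTaskStart(stageId: Int, taskIndex: Int): Unit =
  stageIdToRunningTasks.getOrElseUpdate(stageId, mutable.HashSet.empty) += taskIndex

def onTaskEnd(stageId: Int, taskIndex: Int): Unit =
  stageIdToRunningTasks.get(stageId).foreach { indices =>
    indices -= taskIndex                       // the step a start-only map skips
    if (indices.isEmpty) stageIdToRunningTasks -= stageId
  }

def totalRunningTasks: Int = stageIdToRunningTasks.values.map(_.size).sum
```

A map that only ever accumulates started task indices would instead report the number of tasks ever launched per stage, which is the discrepancy being pointed out.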
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19458
retest this please.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19458
There's a UT failure
(https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/83014/testReport/junit/org.apache.spark.storage/BlockIdSuite/test_bad_deserialization/).
@superbobry
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19519#discussion_r146737263
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala ---
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19519#discussion_r146734075
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala ---
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18492#discussion_r146190420
--- Diff:
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
@@ -373,8 +373,14 @@ private[spark] class ExecutorAllocationManager
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19554
@sjrand would you please close this PR? It is already merged to branch 2.2.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19554
Thanks, merging to branch 2.2.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19554
ok to test.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19554
Can you please add a `[BACKPORT-2.2]` tag to the PR title?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19554
ok to test.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19534
@sitalkedia I'm OK with either.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19519#discussion_r146154530
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkApplication.scala ---
@@ -0,0 +1,55 @@
+/*
+ * Licensed to the Apache Software
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19540
@sjrand , can you please create another PR against branch-2.2? It is not
auto-mergeable, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19540
LGTM, merging to master and branch 2.2.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19519
LGTM.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19540
I think branch 2.2 also has a similar issue when fetching resources from
remote secure HDFS.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19540
ok to test.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19540
Thanks for the fix! I didn't test on a secure cluster when I did the glob path
support, so I didn't realize such an issue
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19534
@sitalkedia I have a very old similar PR #11205 , maybe you can refer to it.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19469
@felixcheung As you can see, there are a bunch of configurations that need to be
added here in https://github.com/apache-spark-on-k8s/spark/pull/516; that's why
I'm asking for a general solution
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19519
@vanzin , how do we leverage this new trait, would you please explain more?
Thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19534
@sitalkedia would you please fix the PR title? It seems broken now.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19509
LGTM, merging to master.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19469
@ssaavedra , yes I think so. With the pull-in of k8s support, I would guess
more configurations need to be added to the exclusion rule. With the current
solution, fixing them one PR at a time doesn't make much sense. We
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19509
LGTM, just one minor comment.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19509#discussion_r145329972
--- Diff:
resource-managers/yarn/src/main/scala/org/apache/spark/deploy/yarn/config.scala
---
@@ -347,6 +347,10 @@ package object config
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19509
I see, thanks for the explanation. I didn't think about such a scenario.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19263
@vanzin, do you have other comments?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19469
@ChenjunZou did you get a chance to look at my left comment?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19509
>The effect of this change is that now it's possible to initialize multiple,
non-concurrent SparkContext instances in the same JVM.
@vanzin , do we support it now? As I remembe
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19476
Jenkins, retest this please.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145013312
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -653,15 +663,34 @@ private[spark] class BlockManager(
require
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145011923
--- Diff:
core/src/test/scala/org/apache/spark/storage/BlockManagerSuite.scala ---
@@ -509,11 +508,10 @@ class BlockManagerSuite extends SparkFunSuite
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145011775
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -684,7 +713,7 @@ private[spark] class BlockManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145010440
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -653,15 +663,34 @@ private[spark] class BlockManager(
require
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145009567
--- Diff: core/src/main/scala/org/apache/spark/SparkConf.scala ---
@@ -662,7 +662,9 @@ private[spark] object SparkConf extends Logging
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r145009167
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -653,15 +663,34 @@ private[spark] class BlockManager(
require
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19419
LGTM.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19396
Sorry for the late response. I understand your purpose now. I think such
behavior discrepancy is not a big problem.
I guess the reason why the NM still runs with the exception is that the NM doesn't
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r144775941
--- Diff: docs/security.md ---
@@ -186,7 +186,54 @@ configure those ports.
+### HTTP Security Headers
+
+Apache Spark can
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144775481
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -1552,4 +1582,65 @@ private[spark] object BlockManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144770999
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -1552,4 +1582,65 @@ private[spark] object BlockManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144769226
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -1552,4 +1582,65 @@ private[spark] object BlockManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144765076
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -1552,4 +1582,65 @@ private[spark] object BlockManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144764761
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -355,11 +355,21 @@ package object config {
.doc
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144763884
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -355,11 +355,21 @@ package object config {
.doc
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19419
>/home/jenkins/workspace/SparkPullRequestBuilder@2/core/src/main/scala/org/apache/spark/internal/config/package.scala:440:0:
Whitespace at end of line
Please fix the style is
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19419
ok to test.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r144488504
--- Diff: docs/configuration.md ---
@@ -2013,7 +2013,62 @@ Apart from these, the following properties are also
available, and may be useful
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19464#discussion_r144472362
--- Diff: core/src/test/scala/org/apache/spark/FileSuite.scala ---
@@ -510,4 +510,86 @@ class FileSuite extends SparkFunSuite with
LocalSparkContext
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19464#discussion_r144472244
--- Diff: core/src/test/scala/org/apache/spark/FileSuite.scala ---
@@ -510,4 +510,86 @@ class FileSuite extends SparkFunSuite with
LocalSparkContext
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144456817
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -426,4 +426,11 @@ package object config {
.toSequence
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19476#discussion_r144453507
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -426,4 +426,11 @@ package object config {
.toSequence
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19476
@cloud-fan @jiangxb1987 @jinxing64 would you please help to review when you
have time, thanks!
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19458#discussion_r144450997
--- Diff:
core/src/main/scala/org/apache/spark/storage/DiskBlockManager.scala ---
@@ -100,7 +100,16 @@ private[spark] class DiskBlockManager(conf
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19263#discussion_r19147
--- Diff: docs/configuration.md ---
@@ -714,6 +714,13 @@ Apart from these, the following properties are also
available, and may be useful
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19263#discussion_r18138
--- Diff:
core/src/main/scala/org/apache/spark/internal/config/package.scala ---
@@ -41,6 +41,22 @@ package object config {
.bytesConf
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r144286844
--- Diff: core/src/main/scala/org/apache/spark/ui/JettyUtils.scala ---
@@ -79,6 +79,9 @@ private[spark] object JettyUtils extends Logging {
val
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r144283398
--- Diff: core/src/main/scala/org/apache/spark/ui/JettyUtils.scala ---
@@ -89,6 +92,13 @@ private[spark] object JettyUtils extends Logging
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19458
Yes, I agree that in any case it should not throw an exception. But in this
PR you filtered out temp shuffle/local blocks; do you think these blocks are
valid or not, are they blocks?
So I'd
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19458
Instead of filtering out temp blocks, why not add parsing rules for
`TempLocalBlockId` and `TempShuffleBlockId`? That could also solve the problem.
Since `DiskBlockManager#getAllFiles` doesn't
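A self-contained sketch of that suggested direction (the regexes and the mini `Id` type here are assumptions for illustration, not copied from Spark's actual `BlockId` parser): teach the name-to-id parser about temp block file names so every on-disk file can be mapped back to an id, rather than filtering temp files out.

```scala
import java.util.UUID

// Stand-in id types mirroring TempLocalBlockId / TempShuffleBlockId.
sealed trait Id
case class TempLocal(uuid: UUID) extends Id
case class TempShuffle(uuid: UUID) extends Id

// Assumed name patterns for temp block files.
val TempLocalRe   = "temp_local_(.+)".r
val TempShuffleRe = "temp_shuffle_(.+)".r

def parse(name: String): Option[Id] = name match {
  case TempLocalRe(u)   => Some(TempLocal(UUID.fromString(u)))
  case TempShuffleRe(u) => Some(TempShuffle(UUID.fromString(u)))
  case _                => None   // unknown name, caller decides how to handle
}
```

With parsing rules in place, an unrecognized file name no longer has to throw, and temp blocks are represented explicitly instead of being silently skipped.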
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r144186220
--- Diff: conf/spark-defaults.conf.template ---
@@ -25,3 +25,10 @@
# spark.serializer
org.apache.spark.serializer.KryoSerializer
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19419
@vanzin @tgravescs @ajbozarth what is your opinion on this PR? Is it a
necessary fix for Spark?
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19468#discussion_r144182701
--- Diff: pom.xml ---
@@ -2649,6 +2649,13 @@
+ kubernetes
--- End diff --
We should also change the sbt
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19464
IIUC this issue also exists in `NewHadoopRDD` and possibly `FileScanRDD`;
we'd better also fix them.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19464#discussion_r144181321
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -196,7 +196,10 @@ class HadoopRDD[K, V](
// add the credentials here
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/19476
[SPARK-22062][CORE] Spill large block to disk in BlockManager's remote
fetch to avoid OOM
## What changes were proposed in this pull request?
In the current BlockManager's
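The archived description is cut off; the general idea named in the PR title, spilling an oversized remote block to disk instead of buffering it fully in memory, can be sketched as follows (the threshold value and all helper names are hypothetical, not the PR's actual code):

```scala
import java.io.{ByteArrayOutputStream, File, FileOutputStream}
import java.nio.ByteBuffer

// Assumed threshold above which a remote block is streamed to a temp file.
val maxInMemoryFetchBytes = 64L * 1024 * 1024

// Returns either a file holding the block (large case) or its bytes (small case).
def fetchRemoteBlock(blockSize: Long,
                     chunks: Iterator[ByteBuffer]): Either[File, Array[Byte]] =
  if (blockSize >= maxInMemoryFetchBytes) {
    val tmp = File.createTempFile("remote-block", ".tmp")
    val out = new FileOutputStream(tmp).getChannel
    try chunks.foreach(c => while (c.hasRemaining) out.write(c))
    finally out.close()
    Left(tmp)                      // caller reads it back as a file-backed buffer
  } else {
    val buf = new ByteArrayOutputStream(blockSize.toInt)
    chunks.foreach { c =>
      val arr = new Array[Byte](c.remaining()); c.get(arr); buf.write(arr)
    }
    Right(buf.toByteArray)
  }
```

The point is that peak memory for a fetch stays bounded by the chunk size rather than the block size, which is what avoids the OOM described in the title.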
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19469
There's a similar PR, #19427. I was wondering if we can provide a general
solution for such issues, like using a configuration to specify all the confs
which need to be reloaded
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19466
Would you please show us an example of how it breaks? The code here, which
assigns all resources to local ones, might work, but it obscures which line is
really broken; can you please describe
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19399
I agree with @squito that the criteria used to define an application's success
should be well considered. Here in your current code, only if all the jobs are
successful is the application marked
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19287
LGTM, merging to master. Thanks!
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19077#discussion_r143380706
--- Diff:
sql/catalyst/src/main/scala/org/apache/spark/sql/catalyst/expressions/complexTypeCreator.scala
---
@@ -116,9 +116,10 @@ private [sql] object
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19287
Jenkins, retest this please.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/19419#discussion_r143377794
--- Diff: core/src/main/scala/org/apache/spark/ui/JettyUtils.scala ---
@@ -79,6 +79,9 @@ private[spark] object JettyUtils extends Logging {
val