Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18230
@saturday-shi would you please update the title to track SPARK-19688?
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18210#discussion_r120816418
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/BlacklistTracker.scala ---
@@ -336,9 +336,9 @@ private[scheduler] object BlacklistTracker extends
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18210#discussion_r120816560
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/BlacklistTracker.scala ---
@@ -336,9 +336,9 @@ private[scheduler] object BlacklistTracker extends
Github user wzhfy commented on a diff in the pull request:
https://github.com/apache/spark/pull/12646#discussion_r120816635
--- Diff: sql/core/src/test/scala/org/apache/spark/sql/SQLQuerySuite.scala
---
@@ -2651,4 +2652,28 @@ class SQLQuerySuite extends QueryTest with
SharedSQLCon
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18230
I guess "spark.yarn.credentials.renewalTime" and
"spark.yarn.credentials.updateTime" should also be excluded.
---
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/18213
@mridulm, the exit code of pyspark or R is really user-defined; a user could
exit with any code, for example `sys.exit(100)`, so potentially it could
overlap.
---
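The overlap @mridulm and @jerryshao are discussing can be seen with a tiny standalone sketch (plain Python, nothing Spark-specific): a user script is free to exit with any status, so no single numeric value is safe for a launcher to reserve for its own signaling.

```python
import subprocess
import sys
import textwrap

# A user script may exit with any status it likes, e.g. sys.exit(100).
# If the framework also reserved 100 as a special exit code, the two
# cases would be indistinguishable to the launcher.
script = textwrap.dedent("""
    import sys
    sys.exit(100)  # user-chosen exit code
""")

result = subprocess.run([sys.executable, "-c", script])
print(result.returncode)  # 100 -- chosen by the user, not the framework
```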
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18212#discussion_r120823962
--- Diff: conf/spark-env.sh.template ---
@@ -23,6 +23,7 @@
# Options read when launching programs locally with
# ./bin/run-example or ./bin/spark-su
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18213
**[Test build #77809 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77809/testReport)**
for PR 18213 at commit
[`1559cbb`](https://github.com/apache/spark/commit/15
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18231
That's 12 bytes. Are there millions of these?
---
Github user BartekH commented on the issue:
https://github.com/apache/spark/pull/17985
@HyukjinKwon Done.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18213
**[Test build #77809 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77809/testReport)**
for PR 18213 at commit
[`1559cbb`](https://github.com/apache/spark/commit/1
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18213
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77809/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18213
Merged build finished. Test PASSed.
---
GitHub user masterwugui opened a pull request:
https://github.com/apache/spark/pull/18237
Create JavaRegressionMetricsExample.java
The original code can't visit the last element of the "parts" array,
so `v[v.length-1]` always equals 0.
## What changes were proposed in t
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18237
Can one of the admins verify this patch?
---
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
Actually it's more than 12 bytes.
Yes, there are millions of these. In my heap dump, it's 1.5 GB
---
GitHub user 10110346 opened a pull request:
https://github.com/apache/spark/pull/18238
[SPARK-21016][core]Improve code fault tolerance for converting string to
number
## What changes were proposed in this pull request?
When converting `string` to `number`(int, long or double),
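As a rough illustration of the fault-tolerance idea (a generic Python sketch of defensive string-to-number conversion, not the actual Scala patch in this PR):

```python
def to_number(s, default=None):
    """Try int, then float; fall back to a default instead of raising.

    Illustrative only: the real SPARK-21016 change is Scala code with
    its own conversion rules.
    """
    for parse in (int, float):
        try:
            return parse(s.strip())
        except (ValueError, AttributeError):  # AttributeError covers None
            continue
    return default

print(to_number("42"))       # 42
print(to_number(" 3.14 "))   # 3.14
print(to_number("oops", 0))  # 0
```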
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18238
Can one of the admins verify this patch?
---
GitHub user jinxing64 opened a pull request:
https://github.com/apache/spark/pull/18239
[SPARK-19462] fix bug in Exchange--pass in a tmp "newPartitioning" in
"prepareShuffleDependency"
When `spark.sql.adaptive.enabled` is true, any rerunning of ancestors of
`ShuffledRowRDD` will fa
GitHub user eatoncys opened a pull request:
https://github.com/apache/spark/pull/18240
[SPARK-21017][sql]Move the length getter before the while to improve
performance
## What changes were proposed in this pull request?
In my test, the cost of the while in the write functio
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18239
I'm not sure whether it is appropriate to make this PR and backport it to 1.6.
It would be great if someone could take some time to review this.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18240
Can one of the admins verify this patch?
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18239
**[Test build #77810 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77810/consoleFull)**
for PR 18239 at commit
[`debd107`](https://github.com/apache/spark/commit/d
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18240#discussion_r120837951
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/HiveFileFormat.scala
---
@@ -142,7 +142,8 @@ class HiveOutputWriter(
over
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18237
You didn't fill in the title or description here. The change is wrong
anyway because v is indexed by i-1 and so the last element in v is filled in.
---
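srowen's indexing point can be checked with a small sketch (a hypothetical Python reconstruction of the loop shape under discussion; the real example is Java):

```python
# Hypothetical reconstruction: the loop runs i over 1..len(parts)-1 but
# writes to v[i-1], so every slot of v -- including the last -- is filled.
parts = ["label", "1.0", "2.0", "3.0"]
v = [0.0] * (len(parts) - 1)
for i in range(1, len(parts)):
    v[i - 1] = float(parts[i])
print(v)  # [1.0, 2.0, 3.0] -- the last element is filled, not 0
```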
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/17276
@mridulm @squito
Thanks a lot for taking the time to review this PR.
I will close it for now and make another one if there is progress.
---
Github user jinxing64 closed the pull request at:
https://github.com/apache/spark/pull/17276
---
Github user eatoncys commented on a diff in the pull request:
https://github.com/apache/spark/pull/18240#discussion_r120840468
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/HiveFileFormat.scala
---
@@ -142,7 +142,8 @@ class HiveOutputWriter(
ov
Github user eatoncys closed the pull request at:
https://github.com/apache/spark/pull/18240
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18231#discussion_r120841272
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java
---
@@ -209,4 +190,52 @@ private ShuffleMet
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18231#discussion_r120841240
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java
---
@@ -209,4 +190,52 @@ private ShuffleMet
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18231#discussion_r120841339
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java
---
@@ -209,4 +190,52 @@ private ShuffleMet
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18220#discussion_r120842205
--- Diff: docs/running-on-mesos.md ---
@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for
information on Spark config
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18220#discussion_r120842136
--- Diff: docs/running-on-mesos.md ---
@@ -469,6 +470,15 @@ See the [configuration page](configuration.html) for
information on Spark config
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18220#discussion_r120842358
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala
---
@@ -0,0 +1,94 @@
+/*
+ * License
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18220#discussion_r120842610
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala
---
@@ -0,0 +1,94 @@
+/*
+ * License
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18220#discussion_r120842747
--- Diff:
resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosProtoUtils.scala
---
@@ -0,0 +1,94 @@
+/*
+ * License
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18234
@vanzin I think it's appropriate to attach this to the existing issue,
because it's inherently connected to any other changes that follow. We can
definitely un-mark it as In Progress.
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18229
**[Test build #3784 has
started](https://amplab.cs.berkeley.edu/jenkins/job/NewSparkPullRequestBuilder/3784/testReport)**
for PR 18229 at commit
[`dd9cc4d`](https://github.com/apache/spark/commit/d
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
@srowen Sorry, I didn't make it clear.
1. In the current code, all blockIds are stored in the iterator. They are
released only when the iterator is traversed.
2. Now I change the `String` to b
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18231
The current iterator doesn't have any state except for an int. What are you
referring to?
---
Github user jinxing64 commented on a diff in the pull request:
https://github.com/apache/spark/pull/18231#discussion_r120844431
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java
---
@@ -209,4 +190,52 @@ private Shuffle
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18210#discussion_r120844928
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/BlacklistTracker.scala ---
@@ -336,9 +336,9 @@ private[scheduler] object BlacklistTracker extends
L
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
I mean the blockIds in `OpenBlocks`; they are referenced in the iterator.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18231
I get it. But that doesn't make the reference in OpenBlocks go away. This
only helps if msg/msgObj can be garbage collected earlier. Is that the
case? Right now this is allocating additional
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18198
Any comments @jkbradley ? I imagine you approve, now that tests pass.
---
Github user jinxing64 commented on a diff in the pull request:
https://github.com/apache/spark/pull/18231#discussion_r120845706
--- Diff:
common/network-shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockHandler.java
---
@@ -209,4 +190,52 @@ private Shuffle
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18215
Merged to master/2.2/2.1
---
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
The blockIds cannot be freed because they are referenced in the iterator.
In the current change they are not. We reference the mapIdAndReduceIds instead.
Thus the blockIds in OpenBlocks can be garbage
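The memory argument can be illustrated outside the JVM (Python object sizes, so the absolute numbers differ from a real heap dump, but the direction is the same): per-block id strings cost far more than the same information packed as primitive ints.

```python
import sys
from array import array

# Illustrative only: Python object sizes, not JVM sizes. One id per block
# as a string "shuffle_<shuffleId>_<mapId>_<reduceId>" vs. the same
# information packed as two primitive ints per block.
string_ids = [f"shuffle_0_{m}_{m % 200}" for m in range(1000)]
per_string = sum(sys.getsizeof(s) for s in string_ids) / len(string_ids)

int_pairs = array("i", [0] * (2 * 1000))  # (mapId, reduceId) per block
per_pair = 2 * int_pairs.itemsize         # payload bytes per block

print(per_string > per_pair)  # True: strings dominate at millions of blocks
```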
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18226
Merged to master
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18215
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18231
That's not the question though. The question is whether they could be freed
even after this change. msg still references it. That's what you need to
establish, if only by some empirical testing.
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18226
---
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18231
**[Test build #77811 has
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77811/testReport)**
for PR 18231 at commit
[`1e53262`](https://github.com/apache/spark/commit/1e
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
There is nothing referencing `msg`, right? I guess `msg` will be
garbage collected promptly.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18231
I'm not clear that's true, no. Not, at least, in the lifetime of the
iterator. That's what has to be true for this to help anything. Do you have
evidence this is true? For example, if you have tests t
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/18210#discussion_r120848724
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/BlacklistTracker.scala ---
@@ -336,9 +336,9 @@ private[scheduler] object BlacklistTracker extends
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
Yes, I think it's great to do some tests and give good evidence.
---
Github user srowen commented on the issue:
https://github.com/apache/spark/pull/18223
I'll merge this. If anything got closed that anyone disagrees with, it can
easily be reopened, so there is virtually no harm in a false-positive anyway.
But looks like the list has been reasonably re
GitHub user guoxiaolongzte opened a pull request:
https://github.com/apache/spark/pull/18241
[SPARK-20997][CORE]'--driver-cores' standalone or Mesos or YARN in Cluster
deploy mode only.
## What changes were proposed in this pull request?
'--driver-cores' standalone or Mes
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18241
Can one of the admins verify this patch?
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18044
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/11459
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18124
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17926
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18010
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/13833
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/12252
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17689
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18045
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17791
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17638
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17640
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18061
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/13720
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/12456
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18163
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/15831
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/12506
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/14036
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/14461
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18041
---
Github user srowen commented on a diff in the pull request:
https://github.com/apache/spark/pull/18241#discussion_r120850448
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -558,8 +558,9 @@ private[deploy] class SparkSubmitArguments(args:
Se
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/16291
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/12217
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18222
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/14995
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/12835
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18223
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/18130
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17480
---
Github user asfgit closed the pull request at:
https://github.com/apache/spark/pull/17141
---
Github user guoxiaolongzte commented on a diff in the pull request:
https://github.com/apache/spark/pull/18212#discussion_r120850820
--- Diff: conf/spark-env.sh.template ---
@@ -23,6 +23,7 @@
# Options read when launching programs locally with
# ./bin/run-example or ./bin/
Github user guoxiaolongzte commented on a diff in the pull request:
https://github.com/apache/spark/pull/18241#discussion_r120851133
--- Diff:
core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala ---
@@ -558,8 +558,9 @@ private[deploy] class SparkSubmitArguments(a
Github user SparkQA commented on the issue:
https://github.com/apache/spark/pull/18239
**[Test build #77810 has
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/77810/consoleFull)**
for PR 18239 at commit
[`debd107`](https://github.com/apache/spark/commit/
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18239
Test PASSed.
Refer to this link for build results (access rights to CI server needed):
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/77810/
Test PASSed.
---
Github user AmplabJenkins commented on the issue:
https://github.com/apache/spark/pull/18239
Merged build finished. Test PASSed.
---
Github user saturday-shi commented on the issue:
https://github.com/apache/spark/pull/18230
> I guess "spark.yarn.credentials.renewalTime" and
"spark.yarn.credentials.updateTime" should also be excluded.
Thank you for pointing that out. I'll check and fix them.
---
Github user saturday-shi commented on the issue:
https://github.com/apache/spark/pull/18230
@jerryshao I've taken a look at `spark.yarn.credentials.renewalTime` and
`spark.yarn.credentials.updateTime`, but I don't think there is any need to
exclude them. Changing these proper
Github user jinxing64 commented on the issue:
https://github.com/apache/spark/pull/18231
@srowen
I did a test to verify this patch.
I wrapped a number of blocks inside `OpenBlocks` and sent them to
`ExternalShuffleBlockHandler`.
With this change:
it cost about 133M in the