Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20474#discussion_r165548039
--- Diff:
core/src/main/scala/org/apache/spark/status/api/v1/OneApplicationResource.scala
---
@@ -51,6 +52,46 @@ private[v1] class
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20474#discussion_r165547727
--- Diff:
core/src/main/scala/org/apache/spark/status/api/v1/OneApplicationResource.scala
---
@@ -51,6 +52,46 @@ private[v1] class
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20460
> I'd target this 2.3 & master. Waiting for tests
@felixcheung is it too risky to target 2.3? This is a fundamental
behavior change. We should make sure k8s could well use fra
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20460
I think here
(https://github.com/apache/spark/blob/032c11b83f0d276bf8085992229b8c598f02798a/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala#L117)
should also be fixed
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
I agree. Sorry for merging it so quickly; let me revert it.
@ssaavedra, would you please submit the PR again when everything is done? Thanks
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r165252724
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r165249764
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r165240426
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
@ssaavedra Do you mean your current patch doesn't work even for the master
branch? In that case, do we need to revert the current patch? CC @felixcheung
@foxish
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Thanks all for your review, greatly appreciated.
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r164997663
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Thanks Felix. I would be inclined not to fix the case mentioned by Felix.
What's your opinion, @HyukjinKwon @ueshin?
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r164976661
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r164973745
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20437#discussion_r164968292
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/dstream/FileInputDStream.scala
---
@@ -157,7 +157,7 @@ class FileInputDStream[K, V, F
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
@felixcheung what is your opinion on this? Do we really need to handle this
case?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20447
Merging to master and 2.3. Thanks for the review!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20382
Sure, will wait for the others to be merged. Thanks @tdas.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r164944276
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/socket.scala
---
@@ -47,130 +48,141 @@ object TextSocketSource
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20447
CC @tdas , please help to review. Thanks!
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/20447
[SPARK-23279][SS] Avoid triggering distributed job for Console sink
## What changes were proposed in this pull request?
Console sink will redistribute collected local data and trigger
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r164934753
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
---
@@ -56,7 +58,7 @@ trait ConsoleWriter extends
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r164933597
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/socket.scala
---
@@ -47,130 +48,141 @@ object TextSocketSource
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
I think the same issue also exists in the Scala `SparkSession` code, because
`setDefaultSession` doesn't hold the lock that `getOrCreate`
(SparkSession) holds.
For example
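The race described here can be illustrated with a minimal, self-contained Python sketch. This is not Spark's actual `SparkSession` code; the `SessionRegistry` class and its method names are hypothetical stand-ins showing why a setter that skips the lock taken by the get-or-create path can overwrite the session the latter installed.

```python
# Illustrative sketch of the locking issue discussed above (hypothetical
# names, not Spark's real code): get_or_create() takes a lock, and
# set_default() must take the SAME lock, otherwise a concurrent
# set_default() can race with get_or_create().
import threading


class SessionRegistry:
    def __init__(self):
        self._lock = threading.RLock()
        self._default = None

    def get_or_create(self):
        # Create the default session exactly once, under the lock.
        with self._lock:
            if self._default is None:
                self._default = object()  # stand-in for a real session
            return self._default

    def set_default(self, session):
        # Taking the same lock here closes the race described in the comment;
        # the reported bug is the variant where this lock is NOT taken.
        with self._lock:
            self._default = session


registry = SessionRegistry()
s1 = registry.get_or_create()
s2 = registry.get_or_create()
assert s1 is s2  # repeated calls return the same default session
```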
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Thanks all for your comments. I think @felixcheung 's case really makes
things complex; I'm not sure if users will use it in such a way. I will try to
address it. I appreciate your comments
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Hi all, can you please review again, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20422
I agree with @squito: unless there's a bug in it, it is risky and
unnecessary to change the logic in this critical path
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20335
Since this is a behavior change compared to 2.2/2.3, I will only merge it
to the master branch. Thanks @pmackles
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20272
>IIUC there was an issue in launching Thrift Server in YARN cluster mode,
and I'm not sure whether it has been fixed (maybe @jerryshao can kindly check
that?)
Sorry I cannot remem
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20422
I think it is necessary to add a unit test to verify the changes.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164364573
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +230,9 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164364286
--- Diff: python/pyspark/sql/session.py ---
@@ -213,7 +213,12 @@ def __init__(self, sparkContext, jsparkSession=None):
self._jsc = self._sc
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Jenkins, retest this please.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
Thanks @HyukjinKwon for your help.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164350646
--- Diff: python/pyspark/sql/session.py ---
@@ -760,6 +764,7 @@ def stop(self):
"""Stop the underlying :clas
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164350171
--- Diff: python/pyspark/sql/session.py ---
@@ -760,6 +764,7 @@ def stop(self):
"""Stop the underlying :clas
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20335
CC @ajbozarth.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164328055
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +225,7 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
@ssaavedra why don't you submit a follow-up PR to remove the nonexistent
configuration as mentioned above?
If we agree to backport to 2.3, then you can create another backport PR
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164109814
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +225,7 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164108216
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +225,7 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20383#discussion_r164106372
--- Diff:
streaming/src/main/scala/org/apache/spark/streaming/Checkpoint.scala ---
@@ -53,6 +53,21 @@ class Checkpoint(ssc: StreamingContext, val
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20404#discussion_r164105856
--- Diff: python/pyspark/sql/session.py ---
@@ -225,6 +225,7 @@ def __init__(self, sparkContext, jsparkSession=None):
if SparkSession
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20404
@zjffdu @HyukjinKwon please help to review. Thanks!
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/20404
[SPARK-23228][PYSPARK] Add Python Created jsparkSession to JVM's
defaultSession
## What changes were proposed in this pull request?
In the current PySpark code, Python created
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
This is not a big issue unless we run Spark Streaming with checkpointing
enabled. I'm not sure whether it is OK to add it to the 2.3.0 release for now (as
this is not a blocker issue).
@felixcheung up
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
hmmm.. The target version in the JIRA is 2.4.0, and we are about to release
2.3.0; are we sure we want to backport
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
Merging to master, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20383
LGTM.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20401
Is that all you can find? @dongjoon-hyun
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20382
@zsxwing @tdas would you please help to review, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20401
LGTM.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20399
LGTM.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20399
Originally we were using reflection for this `HiveDelegationTokenProvider`.
But in that PR we changed to use Hive classes directly; is there any particular
reason
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20399
So it seems the `try-catch` mechanism in `HiveDelegationTokenProvider` is not
useful.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20382
Hi @jose-torres , thanks for your review. I tried both the example you
mentioned and a simple spark-shell command; I think it works, but the path will
always go to the V2 `MicroBatchReader` (still
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20382
Jenkins, retest this please.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20292
LGTM.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r163755819
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/TextSocketStreamSourceV2.scala
---
@@ -0,0 +1,247
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r163753088
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/TextSocketStreamSourceV2.scala
---
@@ -0,0 +1,247
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r163725096
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/TextSocketStreamSourceV2.scala
---
@@ -0,0 +1,247
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20382
@jose-torres can you please help to review, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20292
ok to test.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20327#discussion_r163491802
--- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala ---
@@ -385,7 +386,13 @@ class SparkContext(config: SparkConf) extends Logging
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20382#discussion_r163487603
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/execution/streaming/sources/ConsoleWriter.scala
---
@@ -56,7 +58,7 @@ trait ConsoleWriter extends
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/20382
[SPARK-23097][SQL][SS] Migrate text socket source to V2
## What changes were proposed in this pull request?
This PR moves structured streaming text socket source to V2
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20347
Using `getOrCreate` in the constructor seems to change the semantics. Maybe we can
add a new static method for such usage in `JavaSparkContext
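The design suggestion above can be sketched in a few lines. This is a hypothetical illustration, not Spark's actual `JavaSparkContext` API: the constructor keeps its original "always create" semantics, and get-or-create behavior lives in a separate static factory method.

```python
# Hedged sketch of the suggestion above (the Context class and its method
# names are illustrative, not Spark's API): keep the constructor's
# "always create" semantics, and add a static factory for reuse.
class Context:
    _active = None  # process-wide active context, if any

    def __init__(self, name):
        # Constructor always builds a fresh context (original semantics).
        self.name = name
        Context._active = self

    @staticmethod
    def get_or_create(name):
        # New static entry point: reuse the active context when one exists,
        # instead of changing what the constructor means.
        if Context._active is None:
            Context(name)
        return Context._active


a = Context.get_or_create("app")
b = Context.get_or_create("other")
assert a is b  # the second call reuses the first context
```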
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20347
Can you please explain why we need to change to `getOrCreate`?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20349
LGTM, merging to master/2.3, thanks!
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20349
ok to test.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20349
Can you please search for similar issues in the doc?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20259
I kind of agree with @CodingCat. We have plenty of third-party
monitoring tools to monitor the availability of the Master process; it is not
really necessary to expose this in the Master UI
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20298
Merging to master/2.3, thanks for the fix!
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20026#discussion_r162803492
--- Diff: core/src/main/scala/org/apache/spark/storage/DiskStore.scala ---
@@ -152,7 +153,7 @@ private class DiskBlockData(
file: File
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20026#discussion_r162610474
--- Diff: core/src/main/scala/org/apache/spark/storage/DiskStore.scala ---
@@ -152,7 +153,7 @@ private class DiskBlockData(
file: File
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/19285
Are we targeting this to 2.3 or 2.4?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20281
@gatorsmile , do you have further comment?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20298
LGTM for the fix.
@zsxwing would you please take another look?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20298
Jenkins, retest this please.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20315#discussion_r162338495
--- Diff: core/src/main/scala/org/apache/spark/ui/jobs/StagePage.scala ---
@@ -676,7 +676,7 @@ private[ui] class TaskDataSource(
private var
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/20315
[SPARK-23147][UI] Fix task page table IndexOutOfBound Exception
## What changes were proposed in this pull request?
Stage's task page table will throw an exception when there's
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20305#discussion_r162274760
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionStateBuilder.scala
---
@@ -98,20 +98,7 @@ class HiveSessionStateBuilder(session
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20305#discussion_r162270415
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionStateBuilder.scala
---
@@ -101,6 +102,7 @@ class HiveSessionStateBuilder(session
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20298
Jenkins, retest this please.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20305#discussion_r162249755
--- Diff:
sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveSessionStateBuilder.scala
---
@@ -101,6 +102,7 @@ class HiveSessionStateBuilder(session
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20305
@cloud-fan , please help to review, thanks!
GitHub user jerryshao opened a pull request:
https://github.com/apache/spark/pull/20305
[SPARK-23140][SQL] Add DataSourceV2Strategy to Hive Session state's planner
## What changes were proposed in this pull request?
`DataSourceV2Strategy` is missing
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20281
@cloud-fan would you please take another look on this?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20298
ok to test.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20190
Thanks. Can you please close this one?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20281
LGTM. So it looks like the fix is exactly the same as Hive's.
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20184
>I think that a lazy buffer allocation cannot thoroughly solve this
problem because UnsafeSorterSpillReader has BufferedFileInputStream which will
allocate off-heap memory.
Can
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20184
The code here should be fine for the normal case. The problem is that there are
so many spill files, which requires maintaining lots of handlers' buffers. A
lazy buffer allocation could solve
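The lazy-allocation idea discussed in this thread can be sketched as follows. This is an illustrative model, not Spark's `UnsafeSorterSpillReader`: the hypothetical `LazySpillReader` class defers its read buffer until the first read, so a large number of open spill files does not immediately pin a buffer per file.

```python
# Hedged sketch of lazy buffer allocation for spill readers (names are
# illustrative, not Spark's): each reader allocates its buffer only on
# first use, so thousands of spill files don't all hold buffers up front.
class LazySpillReader:
    def __init__(self, path, buffer_size=1 << 20):
        self.path = path
        self.buffer_size = buffer_size
        self._buf = None  # deferred: nothing allocated at construction

    def read(self):
        if self._buf is None:  # allocate lazily, on first read
            self._buf = bytearray(self.buffer_size)
        return self._buf


readers = [LazySpillReader(f"spill-{i}") for i in range(1000)]
# No buffers exist yet, despite 1000 readers being open:
assert all(r._buf is None for r in readers)
readers[0].read()
assert readers[0]._buf is not None  # only the touched reader allocated
```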
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20190
@RussellSpitzer , can you please submit a PR against the master branch?
Github user jerryshao commented on the issue:
https://github.com/apache/spark/pull/20184
Thanks, let me try to reproduce it locally.
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20183#discussion_r161157993
--- Diff:
core/src/main/scala/org/apache/spark/broadcast/BroadcastManager.scala ---
@@ -52,6 +54,10 @@ private[spark] class BroadcastManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20183#discussion_r161154730
--- Diff:
core/src/main/scala/org/apache/spark/broadcast/BroadcastManager.scala ---
@@ -52,6 +54,10 @@ private[spark] class BroadcastManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20183#discussion_r161151338
--- Diff:
core/src/main/scala/org/apache/spark/broadcast/BroadcastManager.scala ---
@@ -52,6 +54,10 @@ private[spark] class BroadcastManager
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20183#discussion_r161149468
--- Diff:
core/src/main/scala/org/apache/spark/broadcast/TorrentBroadcast.scala ---
@@ -206,36 +206,50 @@ private[spark] class TorrentBroadcast[T
Github user jerryshao commented on a diff in the pull request:
https://github.com/apache/spark/pull/20183#discussion_r161147892
--- Diff:
core/src/main/scala/org/apache/spark/broadcast/TorrentBroadcast.scala ---
@@ -206,36 +206,50 @@ private[spark] class TorrentBroadcast[T