Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4039#issuecomment-70014846
Can you add a unit test for what this fixes? I don't see how this avoids
the exceptions; it just seems to push them down into `MutableValue.update`. A
test case would
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4043#issuecomment-70015231
lgtm.
I was going to suggest that pending stages should be sorted with oldest
submission time first, not reversed ... but I guess we want the completed
stages
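As a hedged sketch of the ordering being discussed (a simplified stand-in for the stage info, not Spark's actual `StageInfo` class), sorting pending stages with the oldest submission time first might look like:

```scala
// Simplified stand-in for illustration only; not Spark's StageInfo.
case class StageInfoSketch(stageId: Int, submissionTime: Option[Long])

val pending = Seq(
  StageInfoSketch(3, Some(300L)),
  StageInfoSketch(1, Some(100L)),
  StageInfoSketch(2, None) // not yet submitted: sort last
)

// Oldest submission first; unsubmitted stages go to the end.
val oldestFirst = pending.sortBy(_.submissionTime.getOrElse(Long.MaxValue))
println(oldestFirst.map(_.stageId)) // List(1, 3, 2)
```

Reversing that `sortBy` key would give the newest-first order the comment contrasts with.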
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4048#issuecomment-70017200
Jenkins, retest this please
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4039#issuecomment-70036236
I think finding & fixing a bug in current behavior is a great reason to add
a unit test. Some part of the implementation is confusing enough to have
allowed a bug in
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4039#issuecomment-70036427
btw, while you're mucking around in there ... it might be nice to change
the `SpecificMutableRow` constructor to take varargs. Change this constr
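As a hedged illustration of the varargs suggestion (class and field names are simplified stand-ins, not Spark's actual `SpecificMutableRow` API):

```scala
// Simplified stand-in for a mutable field; not Spark's MutableValue.
final class MutableInt(var value: Int)

// A varargs constructor lets callers write new Row(a, b, c)
// instead of building a Seq of values first.
class SpecificMutableRowSketch(values: MutableInt*) {
  def length: Int = values.length
  def getInt(i: Int): Int = values(i).value
  def setInt(i: Int, v: Int): Unit = values(i).value = v
}

val row = new SpecificMutableRowSketch(new MutableInt(1), new MutableInt(2))
println(row.getInt(1)) // 2
```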
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4022#issuecomment-70036623
lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4039#issuecomment-70037269
Back to the question of something deeper being wrong ...
I think we'll need to wait for input from somebody more familiar w/ this
code. @marmbrus ?
Bu
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4048#issuecomment-70181484
so, this doesn't actually work quite the way I wanted it to. It turns out
it's skipping all the JUnit tests as well. The JUnit tests are run if you run
with `test
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4048#issuecomment-70292578
@pwendell I like the idea of just getting tests to run faster in general,
but I think it's gonna be hard to make that happen. (Not the most exciting
tasks for beginners
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4139#issuecomment-71303531
Jenkins retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4187#issuecomment-71323376
Jenkins this is ok to test
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4187#issuecomment-71323709
Thanks @MickDavies for investigating and also for putting the
performance comparison into the JIRA. I think the code looks fine, but I'm not
super-familiar w/
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4187#discussion_r23501217
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -426,6 +423,33 @@ private[parquet] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4187#discussion_r23501242
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -426,6 +423,33 @@ private[parquet] class
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4175#issuecomment-71348013
Jenkins this is OK to test
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4175#issuecomment-71348093
this is mentioned in the JIRA, but it's worth noting again here that this
changes the behavior slightly, since it wouldn't throw an exception before.
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4175#issuecomment-71348345
let's try this again ...
Jenkins this is OK to test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4187#discussion_r23504700
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -426,6 +423,33 @@ private[parquet] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4187#discussion_r23505185
--- Diff:
sql/core/src/main/scala/org/apache/spark/sql/parquet/ParquetConverter.scala ---
@@ -426,6 +423,33 @@ private[parquet] class
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4048#issuecomment-71512674
I figured out the magic combination to make sbt, ScalaTest, JUnit, and the
sbt-pom-reader all play nicely together. I had to introduce a new config (or
scope or
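The "new config (or scope)" mentioned here is cut off in this archive. As a hedged sketch of what introducing a custom sbt test configuration generally looks like (names are illustrative, not the PR's actual change), a `build.sbt` fragment might be:

```scala
// Hypothetical build.sbt sketch: a separate configuration for a subset
// of tests, so they can be selected independently of the default Test scope.
lazy val SlowTest = config("slow") extend Test

lazy val root = (project in file("."))
  .configs(SlowTest)
  .settings(
    // Give the new configuration the standard test tasks and settings.
    inConfig(SlowTest)(Defaults.testSettings),
    // Only run tests carrying an illustrative ScalaTest tag in this scope.
    SlowTest / testOptions += Tests.Argument(TestFrameworks.ScalaTest, "-n", "org.example.SlowTag")
  )
```

With this, `sbt slow:test` (or `sbt "SlowTest / test"` in newer sbt) runs just the tagged tests, while plain `sbt test` is unaffected.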
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4050#discussion_r23555820
--- Diff: core/src/main/scala/org/apache/spark/rdd/HadoopRDD.scala ---
@@ -218,13 +219,14 @@ class HadoopRDD[K, V](
// Find a function that
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4141#discussion_r23557845
--- Diff:
yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala ---
@@ -192,15 +186,32 @@ private[yarn] class YarnAllocator
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4141#issuecomment-71523335
just a minor comment, otherwise lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/2333#issuecomment-71532457
Hi @sarutak thanks for your work on this. Josh's other PR
https://github.com/apache/spark/pull/2696 has been merged for a while now. I'm
gonna take anothe
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4187#issuecomment-71534196
thanks for all the extra detail @MickDavies.
lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/3798#issuecomment-71546698
ok to test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/3798#discussion_r23569836
--- Diff:
external/kafka/src/main/scala/org/apache/spark/streaming/kafka/DeterministicKafkaInputDStream.scala
---
@@ -0,0 +1,149 @@
+/*
+ * Licensed
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/3798#discussion_r23570025
--- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
@@ -788,6 +788,20 @@ abstract class RDD[T: ClassTag
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4175#issuecomment-71551408
ok to test
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/3798#issuecomment-71555904
I'm not very knowledgeable about streaming, but from my limited perspective
it looks good
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4175#issuecomment-71577489
I think these failures are real; looks like you need to do a similar
updating of the args to `registerTempTable` in the pyspark tests, e.g.
[here](https://github.com
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/3798#issuecomment-71577982
@koeninger
I doubt that we want to go this route in this case, but just in case you're
interested, I think a much better way to handle multiple errors graceful
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4214#discussion_r23652649
--- Diff:
core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -163,9 +179,6 @@ private[history] class FsHistoryProvider(conf
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4214#discussion_r23652781
--- Diff:
core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -113,12 +129,12 @@ private[history] class FsHistoryProvider(conf
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4204#discussion_r23694090
--- Diff:
core/src/main/scala/org/apache/spark/storage/LocalFileSystem.scala ---
@@ -0,0 +1,58 @@
+/*
+ * Licensed to the Apache Software Foundation
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4220#issuecomment-71855275
retest this please
hopefully those test failures were random, let's see. btw, I think that if
you want the exact same patch applied to multiple branches, the
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4249#discussion_r23696141
--- Diff:
sql/hive/src/test/scala/org/apache/spark/sql/hive/execution/HiveQuerySuite.scala
---
@@ -63,6 +63,37 @@ class HiveQuerySuite extends
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4155#discussion_r23703042
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/OutputCommitCoordinatorSuite.scala
---
@@ -0,0 +1,177 @@
+/*
+ * Licensed to the Apache
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4155#discussion_r23704462
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/OutputCommitCoordinatorSuite.scala
---
@@ -0,0 +1,177 @@
+/*
+ * Licensed to the Apache
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4155#issuecomment-71883720
I worry about how complicated the test is, and how much it needs to muck
around with internals ... it may be hard to keep up to date as those internals
change. And, I
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4220#issuecomment-71884820
retest this please
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4082#discussion_r23708804
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
---
@@ -369,7 +369,7 @@ private[spark] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23709166
--- Diff: core/src/main/scala/org/apache/spark/deploy/DeployMessage.scala
---
@@ -148,15 +148,22 @@ private[deploy] object DeployMessages
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23709574
--- Diff: core/src/main/scala/org/apache/spark/deploy/master/Master.scala
---
@@ -121,6 +122,17 @@ private[spark] class Master(
throw new
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23710115
--- Diff: core/src/main/scala/org/apache/spark/deploy/SparkSubmit.scala ---
@@ -71,21 +81,64 @@ object SparkSubmit {
if (appArgs.verbose
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4082#discussion_r23712334
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala
---
@@ -369,7 +369,7 @@ private[spark] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23724419
--- Diff:
core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolMessage.scala
---
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4220#issuecomment-71923273
I kinda see what is going on with the tests now. A [test case in
SparkSubmitSuite](https://github.com/apache/spark/blob/master/core/src/test/scala/org/apache/spark/deploy
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4216#issuecomment-71924525
I made a comment at one spot in the code, but throughout I find the name
"stable" confusing. It implies the other one is "unstable", and without t
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23735817
--- Diff:
core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolMessage.scala
---
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4220#discussion_r23776203
--- Diff:
core/src/test/scala/org/apache/spark/util/ResetSystemProperties.scala ---
@@ -42,7 +43,7 @@ private[spark] trait ResetSystemProperties extends
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4220#issuecomment-72079687
lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4220#issuecomment-72082093
retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4271#issuecomment-72083678
ok to test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4216#discussion_r23799173
--- Diff:
core/src/main/scala/org/apache/spark/deploy/rest/SubmitRestProtocolMessage.scala
---
@@ -0,0 +1,201 @@
+/*
+ * Licensed to the Apache
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4271#issuecomment-72099030
More real failures. Looks like you need to add that param in a few more
test cases still.
I will reiterate -- if we flip the default value, it's no longer an api
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/2851#issuecomment-74698831
Hi @CodingCat
thanks for making all the updates. Sorry I hadn't realized the subtlety w/
`Int` vs `Long` on the `RDDBlockId` and `BroadcastBlockId`. Sti
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24829369
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SparkListenerBus.scala ---
@@ -24,7 +24,13 @@ import org.apache.spark.util.ListenerBus
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24830276
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SparkListenerBus.scala ---
@@ -24,7 +24,13 @@ import org.apache.spark.util.ListenerBus
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24831053
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala ---
@@ -522,7 +523,9 @@ private[spark] class BlockManagerInfo
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24831483
--- Diff:
core/src/main/scala/org/apache/spark/storage/BlockManagerMasterActor.scala ---
@@ -522,7 +523,9 @@ private[spark] class BlockManagerInfo
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24831712
--- Diff: core/src/main/scala/org/apache/spark/storage/RDDInfo.scala ---
@@ -21,13 +21,14 @@ import org.apache.spark.annotation.DeveloperApi
import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24833405
--- Diff: core/src/main/scala/org/apache/spark/storage/RDDInfo.scala ---
@@ -49,9 +50,40 @@ class RDDInfo(
}
}
+
private[spark
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24833566
--- Diff: core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
---
@@ -271,4 +368,19 @@ private[spark] object StorageUtils
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24833690
--- Diff:
core/src/main/scala/org/apache/spark/ui/storage/InMemoryObjectPage.scala ---
@@ -0,0 +1,123 @@
+/*
+ * Licensed to the Apache Software
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r24835658
--- Diff:
core/src/main/scala/org/apache/spark/ui/storage/BroadcastPage.scala ---
@@ -0,0 +1,90 @@
+/*
+ * Licensed to the Apache Software Foundation
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/2851#issuecomment-74713997
can you also post a screenshot of the detailed page for a broadcast var?
Ideally involving a broadcast var that gets turned into multiple blocks by
`TorrentBroadcast`, I
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4629#discussion_r24852684
--- Diff: python/pyspark/tests.py ---
@@ -740,6 +739,27 @@ def test_multiple_python_java_RDD_conversions(self):
converted_rdd = RDD
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4629#discussion_r24861247
--- Diff: python/pyspark/tests.py ---
@@ -740,6 +739,27 @@ def test_multiple_python_java_RDD_conversions(self):
converted_rdd = RDD
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4435#issuecomment-75261729
Hi @andrewor14 , there is already a doc on the JIRA:
https://issues.apache.org/jira/secure/attachment/12695540/sparkmonitoringjsondesign.pdf
I changed the names
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4688#discussion_r25485041
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/Client.scala ---
@@ -540,6 +561,29 @@ private[spark] class Client(
amContainer
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4688#discussion_r25485489
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala
---
@@ -71,6 +72,16 @@ class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4450#discussion_r25627160
--- Diff:
core/src/main/scala/org/apache/spark/util/collection/ChainedBuffer.scala ---
@@ -0,0 +1,134 @@
+/*
+ * Licensed to the Apache Software
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/4857
[wip][SPARK-1391][SPARK-3151] 2g partition limit
https://issues.apache.org/jira/browse/SPARK-1391
This is still really rough; I'm looking for some feedback on overall
design, it's not
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4857#discussion_r25641946
--- Diff:
core/src/main/scala/org/apache/spark/network/netty/NettyBlockRpcServer.scala ---
@@ -63,11 +83,83 @@ class NettyBlockRpcServer
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4857#discussion_r25642017
--- Diff:
core/src/main/scala/org/apache/spark/network/netty/NettyBlockTransferService.scala
---
@@ -106,40 +109,54 @@ class NettyBlockTransferService(conf
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4857#discussion_r25642290
--- Diff:
core/src/main/scala/org/apache/spark/network/nio/NioBlockTransferService.scala
---
@@ -143,7 +143,7 @@ final class NioBlockTransferService(conf
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4857#discussion_r25644640
--- Diff:
network/common/src/main/java/org/apache/spark/network/buffer/NioManagedBuffer.java
---
@@ -41,13 +41,13 @@ public long size
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4857#discussion_r25644794
--- Diff:
core/src/test/scala/org/apache/spark/network/netty/NettyBlockTransferSuite.scala
---
@@ -0,0 +1,154 @@
+/*
+ * Licensed to the Apache
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/4877
[SPARK-5949] HighlyCompressedMapStatus needs more classes registered w/ kryo
https://issues.apache.org/jira/browse/SPARK-5949
You can merge this pull request into a Git repository by running
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4877#discussion_r25724532
--- Diff:
core/src/test/scala/org/apache/spark/serializer/KryoSerializerSuite.scala ---
@@ -23,8 +23,10 @@ import scala.reflect.ClassTag
import
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4877#issuecomment-77030736
ok I think I fixed the style issues
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4877#discussion_r25727630
--- Diff:
core/src/test/scala/org/apache/spark/serializer/KryoSerializerSuite.scala ---
@@ -23,8 +23,10 @@ import scala.reflect.ClassTag
import
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4857#issuecomment-77946339
closing since this is getting broken into different chunks of work.
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4435#issuecomment-77946570
Hi @JoshRosen, thanks for the review. Sorry for getting all the style details
wrong -- I will take a closer look at all the code; I just wanted to get some
validation of the
Github user squito closed the pull request at:
https://github.com/apache/spark/pull/4857
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4964#issuecomment-78319536
I was just discussing this w/ Hari a bit; the one issue w/ this PR is
having a test case to demonstrate the old problem, verify the fix, & prevent
regression. We
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26391448
--- Diff: core/src/main/scala/org/apache/spark/HeartbeatReceiver.scala ---
@@ -67,10 +67,8 @@ private[spark] class HeartbeatReceiver(sc: SparkContext
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26391837
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -181,6 +181,12 @@ private[spark] class EventLoggingListener
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26392169
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala
---
@@ -357,7 +357,13 @@ private[spark] class BlockManager(
info
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393024
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/EventLoggingListener.scala ---
@@ -181,6 +181,12 @@ private[spark] class EventLoggingListener
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393202
--- Diff:
core/src/main/scala/org/apache/spark/storage/StorageStatusListener.scala ---
@@ -88,4 +88,19 @@ class StorageStatusListener extends SparkListener
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393526
--- Diff: core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
---
@@ -101,25 +122,30 @@ class StorageStatus(val blockManagerId:
BlockManagerId
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393712
--- Diff: core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
---
@@ -166,28 +195,69 @@ class StorageStatus(val blockManagerId:
BlockManagerId
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26394350
--- Diff: core/src/main/scala/org/apache/spark/ui/storage/StorageTab.scala
---
@@ -39,11 +40,18 @@ private[ui] class StorageTab(parent: SparkUI) extends
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393960
--- Diff: core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
---
@@ -224,8 +299,14 @@ class StorageStatus(val blockManagerId:
BlockManagerId
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26394598
--- Diff: core/src/main/scala/org/apache/spark/ui/storage/StorageTab.scala
---
@@ -79,4 +87,19 @@ class StorageListener(storageStatusListener
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26393684
--- Diff: core/src/main/scala/org/apache/spark/storage/StorageUtils.scala
---
@@ -166,28 +195,69 @@ class StorageStatus(val blockManagerId:
BlockManagerId
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/2851#discussion_r26394719
--- Diff: core/src/main/scala/org/apache/spark/ui/storage/StorageTab.scala
---
@@ -79,4 +87,19 @@ class StorageListener(storageStatusListener
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/2851#issuecomment-79050227
Hi @CodingCat , I took another look through everything. My comments are
all very minor. But I think I'd still like to get some more thoughts from
others on ho