Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15505#discussion_r98591359
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskDescription.scala ---
@@ -51,8 +54,39 @@ private[spark] class TaskDescription(
val
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15237
@erenavsarogullari sorry to push back, but if you're willing to do the
filename thing now, why not just tackle it in this same pr? seems pretty minor
to separate into its own issue
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16620
Hi @jinxing64
sorry to go back and forth on this numerous times -- I think I have another
alternative, see https://github.com/squito/spark/tree/SPARK-19263_alternate
It's most of your
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r98703486
--- Diff: core/src/main/scala/org/apache/spark/scheduler/Stage.scala ---
@@ -68,6 +68,12 @@ private[scheduler] abstract class Stage(
/** Set of jobs
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r98703683
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/SchedulerIntegrationSuite.scala
---
@@ -648,4 +660,70 @@ class BasicSchedulerIntegrationSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r98699067
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -1212,8 +1223,9 @@ class DAGScheduler
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/16781
[SPARK-12297][SQL][POC] Hive compatibility for Parquet Timestamps
## What changes were proposed in this pull request?
Hive has very strange behavior when writing timestamps to parquet data
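The PR description is cut off here, but SPARK-12297 concerns timestamps whose meaning shifts with the writer's or reader's timezone. As an illustrative (non-Spark) Scala sketch of the underlying ambiguity: the same absolute instant renders as different wall-clock times depending on the zone used to interpret it. Zone names below are examples only.

```scala
import java.time.{Instant, LocalDateTime, ZoneId}

// The same absolute instant, viewed through two different zones, yields
// two different wall-clock times -- the kind of ambiguity at issue when
// a timestamp's on-disk value depends on a local timezone.
val instant = Instant.parse("2017-01-01T00:00:00Z")
val utc = LocalDateTime.ofInstant(instant, ZoneId.of("UTC"))
val la  = LocalDateTime.ofInstant(instant, ZoneId.of("America/Los_Angeles"))

println(s"UTC wall clock: $utc")  // 2017-01-01T00:00
println(s"LA  wall clock: $la")   // 2016-12-31T16:00
```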
Github user squito closed the pull request at:
https://github.com/apache/spark/pull/16781
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16620
Hi @jinxing64
I'm sorry I haven't had time to look again. So the one big concern I had
was still that test case -- I know you fixed up some of the things I complained
about, b
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99474371
--- Diff:
core/src/test/scala/org/apache/spark/deploy/StandaloneDynamicAllocationSuite.scala
---
@@ -489,6 +491,29 @@ class StandaloneDynamicAllocationSuite
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15237
merged to master, thanks @erenavsarogullari
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99596040
--- Diff:
core/src/test/scala/org/apache/spark/deploy/StandaloneDynamicAllocationSuite.scala
---
@@ -467,6 +469,51 @@ class StandaloneDynamicAllocationSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99596922
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/BlacklistTrackerSuite.scala ---
@@ -456,4 +461,69 @@ class BlacklistTrackerSuite extends
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99595765
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/cluster/CoarseGrainedSchedulerBackend.scala
---
@@ -600,6 +603,16 @@ class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99597739
--- Diff:
core/src/test/scala/org/apache/spark/deploy/StandaloneDynamicAllocationSuite.scala
---
@@ -467,6 +469,51 @@ class StandaloneDynamicAllocationSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99615906
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -2161,6 +2161,96 @@ class DAGSchedulerSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99615141
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -14,7 +14,7 @@
* See the License for the specific language
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99618127
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/SchedulerIntegrationSuite.scala
---
@@ -648,4 +661,70 @@ class BasicSchedulerIntegrationSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99615666
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -2161,6 +2161,96 @@ class DAGSchedulerSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99615449
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -2161,6 +2161,96 @@ class DAGSchedulerSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99618027
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -2161,6 +2161,96 @@ class DAGSchedulerSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16620#discussion_r99616688
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala
---
@@ -1191,8 +1191,29 @@ class DAGScheduler(
} else
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16650#discussion_r99664044
--- Diff:
core/src/test/scala/org/apache/spark/deploy/StandaloneDynamicAllocationSuite.scala
---
@@ -467,6 +469,52 @@ class StandaloneDynamicAllocationSuite
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16376
yes, I think this is ready (I just noticed a couple of minor nits with a
fresh read but no real changes)
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16831
lgtm
sorry my fault in not breaking this out in the first place.
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16620
@kayousterhout I think your fix is correct, but I actually think it's a
bigger change in behavior, one that has been explicitly argued *against* in the
past. I think the idea is that if you've
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16639
> (1) Instead of this approach, did you consider walking through the
exceptions (with getCause()) to see if there's a nested FetchFailure in there?
That seems simpler, with the con of
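The cause-walking idea raised in that first question can be sketched generically. This is a hypothetical helper, not code from the PR; `FetchFailedLike` below stands in for Spark's real `FetchFailedException`, which isn't available in a self-contained snippet.

```scala
import scala.annotation.tailrec
import scala.reflect.ClassTag

// Stand-in for org.apache.spark.shuffle.FetchFailedException.
class FetchFailedLike(msg: String) extends Exception(msg)

// Walk the getCause() chain until an exception of the wanted type is found,
// or the chain ends (getCause returns null).
@tailrec
def findCause[T <: Throwable](t: Throwable)(implicit ct: ClassTag[T]): Option[T] =
  t match {
    case null      => None
    case wanted: T => Some(wanted)
    case other     => findCause[T](other.getCause)
  }

// A fetch failure buried under two layers of wrapping is still found.
val wrapped = new RuntimeException("outer",
  new IllegalStateException("middle", new FetchFailedLike("lost shuffle block")))
```

The stated con applies: this finds a nested failure wherever it hides, but treats any wrapped fetch failure as authoritative even when the wrapping exception changed its meaning.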
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16639#discussion_r99950321
--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -405,6 +415,13 @@ private[spark] class Executor
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16831
@jinxing64 that way of testing is fine, but I find it's much faster to use
sbt.
http://www.scala-sbt.org/0.13/docs/Testing.html
```
build/sbt -Pyarn -Phadoop-2.6 -Phive
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15982
I think the comment on `#mergeSpillsWithFileStreams` needs to be updated
slightly to include encryption, but other than that lgtm.
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
Jenkins, retest this please
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
Jenkins, retest this please
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
ping @kayousterhout. I've merged in the other changes so this is up to
date now.
I also did another pass and did a bit of minor cleanup and commenting, I
think in line with stuff
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15237
Jenkins, ok to test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90544307
--- Diff: core/src/test/scala/org/apache/spark/scheduler/PoolSuite.scala ---
@@ -20,15 +20,21 @@ package org.apache.spark.scheduler
import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90547665
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SchedulableBuilder.scala ---
@@ -102,38 +105,55 @@ private[spark] class FairSchedulableBuilder(val
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90548835
--- Diff: core/src/test/scala/org/apache/spark/scheduler/PoolSuite.scala ---
@@ -178,4 +177,36 @@ class PoolSuite extends SparkFunSuite with
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90543989
--- Diff: core/src/test/scala/org/apache/spark/scheduler/PoolSuite.scala ---
@@ -74,30 +79,24 @@ class PoolSuite extends SparkFunSuite with
LocalSparkContext
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90543844
--- Diff: core/src/test/scala/org/apache/spark/scheduler/PoolSuite.scala ---
@@ -20,15 +20,21 @@ package org.apache.spark.scheduler
import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90548446
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SchedulableBuilder.scala ---
@@ -102,38 +105,55 @@ private[spark] class FairSchedulableBuilder(val
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15237#discussion_r90548133
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/SchedulableBuilder.scala ---
@@ -102,38 +105,55 @@ private[spark] class FairSchedulableBuilder(val
Github user squito commented on the issue:
https://github.com/apache/spark/pull/11105
I'm thinking about the binary compatibility issue specifically with
`DoubleAccumulator` etc. Whether we intended it or not, `DoubleAccumulator` is
not `final`, so a user could have subclass
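The point about `DoubleAccumulator` not being `final` can be illustrated with a stand-in class (hypothetical names, not Spark's actual API): once a public class can be subclassed, its method signatures become part of the binary-compatibility contract, because subclasses compiled against the old signatures exist in user code.

```scala
// Stand-in for a non-final public accumulator class. Because it is not
// final, user code like LoggingAccumulator below may exist in the wild,
// and changing add()'s signature would break that code at link time.
class DoubleAccumulatorLike {
  private var sum = 0.0
  def add(v: Double): Unit = { sum += v }
  def value: Double = sum
}

class LoggingAccumulator extends DoubleAccumulatorLike {
  var calls = 0
  override def add(v: Double): Unit = {
    calls += 1            // user-added behavior layered on the base class
    super.add(v)
  }
}

val acc = new LoggingAccumulator
acc.add(1.5)
acc.add(2.5)
println(acc.value)  // 4.0
```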
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/14079#discussion_r92229579
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/SchedulerIntegrationSuite.scala
---
@@ -157,8 +160,16 @@ abstract class SchedulerIntegrationSuite[T
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/14079#discussion_r92254218
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/BlacklistTracker.scala ---
@@ -17,10 +17,254 @@
package org.apache.spark.scheduler
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/14079#discussion_r92267462
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -678,4 +716,13 @@ private[spark] object TaskSchedulerImpl
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/16270
[SPARK-18846][Scheduler] Fix flakiness in SchedulerIntegrationSuite
## What changes were proposed in this pull request?
There is a small race in SchedulerIntegrationSuite.
The test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/14079#discussion_r92271406
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/SchedulerIntegrationSuite.scala
---
@@ -157,8 +160,16 @@ abstract class SchedulerIntegrationSuite[T
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/14079#discussion_r92275871
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala ---
@@ -408,6 +411,96 @@ class TaskSchedulerImplSuite extends
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
thanks for the review @kayousterhout. I also added a testcase to
BlacklistTrackerSuite, "task failure timeout works as expected for long-running
tasksets" to cover your point about the lo
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16270#discussion_r92284372
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/SchedulerIntegrationSuite.scala
---
@@ -27,6 +27,8 @@ import scala.language.existentials
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
Jenkins, retest this please
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16270
merged to master
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
thanks @kayousterhout ! appreciate all the time you've spent helping out
on this issue.
merged to master
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/16298
[SPARK-8425][Scheduler][HOTFIX] fix scala 2.10 compile error
## What changes were proposed in this pull request?
https://github.com/apache/spark/commit
Github user squito commented on the issue:
https://github.com/apache/spark/pull/14079
oops, thanks for letting me know @zsxwing , I just submitted
https://github.com/apache/spark/pull/16298
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15505#discussion_r93077759
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/local/LocalSchedulerBackend.scala
---
@@ -59,6 +62,12 @@ private[spark] class LocalEndpoint
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15505#discussion_r93078511
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/TaskSchedulerImplSuite.scala ---
@@ -139,29 +139,6 @@ class TaskSchedulerImplSuite extends
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/15505#discussion_r93079839
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskDescription.scala ---
@@ -17,27 +17,179 @@
package org.apache.spark.scheduler
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16053#discussion_r93080712
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskDescription.scala ---
@@ -17,27 +17,119 @@
package org.apache.spark.scheduler
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15505
@witgo @kayousterhout where do we stand on this and
https://github.com/apache/spark/pull/16053? Both still viable alternatives?
https://github.com/apache/spark/pull/16053 is still missing
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/16354
[SPARK-18886][Scheduler][WIP] Adjust Delay scheduling to prevent
under-utilization of cluster
## What changes were proposed in this pull request?
This is a significant change to delay
Github user squito commented on the issue:
https://github.com/apache/spark/pull/16354
@mridulm @markhamstra @kayousterhout
This is *not* ready to merge -- it needs some cleanup and more tests -- but
I thought that seeing an implementation might help think through the design. I
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/16354#discussion_r93292503
--- Diff:
core/src/test/scala/org/apache/spark/scheduler/TaskSetManagerSuite.scala ---
@@ -549,11 +546,15 @@ class TaskSetManagerSuite extends SparkFunSuite
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/16376
[SPARK-18967][SCHEDULER] compute locality levels even if delay = 0
## What changes were proposed in this pull request?
Before this change, with delay scheduling off, spark would effectively
Github user squito commented on the issue:
https://github.com/apache/spark/pull/15505
Looks like @kayousterhout posted some comments addressing my concerns on
https://github.com/apache/spark/pull/16053 at the same time as my last set of
comments. But essentially it sounds like Kay
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6648#issuecomment-155267074
Jenkins, retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9264#issuecomment-155438549
yeah it's confusing to go through the mima messages -- the one log line you
quoted is for spark-mllib. The issues are in spark-core:
```
[error] * synthetic
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6648#issuecomment-155439140
Jenkins, retest this please
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9559#discussion_r44435230
--- Diff: core/src/main/scala/org/apache/spark/scheduler/MapStatus.scala ---
@@ -193,6 +254,12 @@ private[spark] object HighlyCompressedMapStatus
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9559#issuecomment-155495152
Sean has raised some important higher level questions on the jira -- I'd
like us to resolve the discussion there before moving forward on this.
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6648#issuecomment-155514893
Jenkins, retest this please
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44496929
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
@@ -93,6 +95,10 @@ private[spark] class IndexShuffleBlockResolver
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44497139
--- Diff:
core/src/main/java/org/apache/spark/shuffle/sort/BypassMergeSortShuffleWriter.java
---
@@ -155,9 +156,20 @@ public void write(Iterator<Product2<K, V>> reco
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44497573
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleWriter.scala ---
@@ -106,6 +108,19 @@ private[spark] class HashShuffleWriter[K, V
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44497624
--- Diff:
core/src/main/java/org/apache/spark/shuffle/sort/BypassMergeSortShuffleWriter.java
---
@@ -155,9 +156,20 @@ public void write(Iterator<Product2<K, V>> reco
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9620#issuecomment-156176930
sorry Josh, thanks for taking care of it
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44715168
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
@@ -93,6 +93,29 @@ private[spark] class IndexShuffleBlockResolver
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44715331
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
@@ -93,6 +93,29 @@ private[spark] class IndexShuffleBlockResolver
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9610#discussion_r44717955
--- Diff:
core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleWriter.scala ---
@@ -106,6 +108,19 @@ private[spark] class HashShuffleWriter[K, V
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9610#issuecomment-156244415
you can take this test case if you like:
https://github.com/squito/spark/blob/SPARK-8029_first_wins/core/src/test/scala/org/apache/spark/ShuffleSuite.scala#L351
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9661#issuecomment-156258591
@srowen There is still some analysis left to be done, but I think there is
a growing case that roaring is actually the right way to go, with some minor
tweaks. So I
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44725340
--- Diff: core/pom.xml ---
@@ -174,6 +174,11 @@
      lz4
+      <groupId>org.roaringbitmap</groupId>
+      <artifactId>RoaringBitmap</artifactId>
+      <version>0.4.5</version>
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44725556
--- Diff:
core/src/main/scala/org/apache/spark/serializer/KryoSerializer.scala ---
@@ -362,6 +364,12 @@ private[serializer] object KryoSerializer
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44726575
--- Diff: core/src/main/scala/org/apache/spark/scheduler/MapStatus.scala ---
@@ -176,15 +179,17 @@ private[spark] object HighlyCompressedMapStatus
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44726771
--- Diff: core/src/main/scala/org/apache/spark/scheduler/MapStatus.scala ---
@@ -193,6 +198,11 @@ private[spark] object HighlyCompressedMapStatus
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44727149
--- Diff: core/src/main/scala/org/apache/spark/scheduler/MapStatus.scala ---
@@ -154,15 +155,17 @@ private[spark] class HighlyCompressedMapStatus
private
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9661#discussion_r44727600
--- Diff: core/src/main/scala/org/apache/spark/scheduler/MapStatus.scala ---
@@ -176,15 +179,17 @@ private[spark] object HighlyCompressedMapStatus
Github user squito closed the pull request at:
https://github.com/apache/spark/pull/6648
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9661#issuecomment-156491055
@yaooqinn thanks for the additional analysis, but I would really prefer
this be written up and attached to the jira for reference -- extended
discussions in the PR are
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6291#issuecomment-156571133
lgtm pending tests!
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6263#discussion_r44960862
--- Diff: core/src/main/scala/org/apache/spark/ui/exec/ExecutorsPage.scala
---
@@ -25,6 +25,7 @@ import scala.xml.Node
import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6263#discussion_r44961121
--- Diff:
core/src/main/scala/org/apache/spark/storage/StorageStatusListener.scala ---
@@ -28,15 +35,34 @@ import org.apache.spark.scheduler
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6263#discussion_r44961686
--- Diff:
core/src/main/scala/org/apache/spark/storage/StorageStatusListener.scala ---
@@ -87,6 +113,8 @@ class StorageStatusListener extends SparkListener
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6263#issuecomment-157128907
@archit279thakur yeah don't worry about that test failure ... when you push
updates & bring up to date w/ master the tests will re-run in any case
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/9816#discussion_r45647600
--- Diff: core/src/main/scala/org/apache/spark/Logging.scala ---
@@ -119,30 +119,31 @@ trait Logging {
val usingLog4j12
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/9816#issuecomment-159064395
lgtm, just a tiny comment
Github user squito commented on the issue:
https://github.com/apache/spark/pull/19893
> The last suspicious big group of threads (at least for me) is
broadcast-exchange.* but as I've seen this is not false positive because the
threadpool never stopped. In BroadcastExchangeEx
Github user squito commented on the issue:
https://github.com/apache/spark/pull/19893
ok I just took a look at BroadcastExchangeExec, I see what you mean. It
isn't *that* bad, since spark isn't continually creating more instances of
those threads (you're not suppos
Github user squito commented on the issue:
https://github.com/apache/spark/pull/19893
lgtm
@jiangxb1987 are you still looking at this?
---
-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For
Github user squito closed the pull request at:
https://github.com/apache/spark/pull/19250