Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7014#discussion_r33468301
--- Diff: core/src/main/scala/org/apache/spark/TaskEndReason.scala ---
@@ -97,11 +101,17 @@ case class ExceptionFailure(
description: String
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490036
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala ---
@@ -59,6 +59,9 @@ private[spark] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490121
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala ---
@@ -191,13 +194,29 @@ private[spark] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490259
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
@@ -86,10 +89,106 @@ private[mesos] trait
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490252
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
@@ -86,10 +89,106 @@ private[mesos] trait
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490244
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
@@ -86,10 +89,106 @@ private[mesos] trait
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33490235
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala ---
@@ -17,16 +17,19 @@
package
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7227#issuecomment-118635263
Jenkins, ok to test
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastructure@apache.org or file a JIRA ticket
with INFRA.
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7227#issuecomment-118635287
lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6750#issuecomment-118214549
Jenkins, retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-118214495
merged to master.
Thanks for all your hard work on this @BryanCutler! this was probably a
lot more work than you initially expected (and for me too), but I'm glad
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-118078567
Jenkins, retest this please
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6990#discussion_r33794783
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
@@ -833,8 +833,10 @@ private[spark] class BlockManager(
logDebug(Put
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6870#discussion_r33792632
--- Diff: core/src/main/scala/org/apache/spark/scheduler/ExecutorBlacklistTracker.scala ---
@@ -0,0 +1,175 @@
+/*
+ * Licensed to the Apache Software
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-118078540
@BryanCutler yeah making it serializable seems fine. I have a feeling that
there is probably some field that could be marked `transient` but that is not a
big deal
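To illustrate the suggestion: a field marked `transient` is simply skipped by Java serialization and comes back as its default value (`null`, `0`, …) on deserialization, which is why it is a cheap way to trim a serialized class. A minimal sketch — the `Endpoint` class and its fields are hypothetical, not taken from the PR:

```java
import java.io.*;

// Hypothetical serializable class: the transient fields are skipped when
// the object is written, and read back as null after deserialization.
class Endpoint implements Serializable {
    String name;                     // serialized normally
    transient Thread worker;         // skipped: a Thread isn't serializable anyway
    transient String cachedDisplay;  // skipped: cheap to recompute on demand

    Endpoint(String name) {
        this.name = name;
        this.cachedDisplay = "endpoint:" + name;
    }
}

public class TransientDemo {
    public static void main(String[] args) throws Exception {
        ByteArrayOutputStream bytes = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(bytes)) {
            out.writeObject(new Endpoint("driver"));
        }
        try (ObjectInputStream in = new ObjectInputStream(
                new ByteArrayInputStream(bytes.toByteArray()))) {
            Endpoint copy = (Endpoint) in.readObject();
            System.out.println(copy.name);                  // driver
            System.out.println(copy.cachedDisplay == null); // true
        }
    }
}
```

The practical upshot of the comment above: marking a class `Serializable` works even when some members can't travel over the wire, as long as those members are transient and recomputable.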
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r33797734
--- Diff: core/src/main/scala/org/apache/spark/util/AkkaUtils.scala ---
@@ -147,7 +146,7 @@ private[spark] object AkkaUtils extends Logging {
def
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r33798162
--- Diff: core/src/test/scala/org/apache/spark/rpc/RpcEnvSuite.scala ---
@@ -539,6 +544,97 @@ abstract class RpcEnvSuite extends SparkFunSuite
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4055#issuecomment-118944562
@suyanNone I agree that right now, that is the way task attempts are
tracked through the various `Task*` classes. But my concern is that it could
easily lead to very
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4055#discussion_r33965943
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala ---
@@ -1193,8 +1193,10 @@ class DAGScheduler(
// TODO
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4055#discussion_r33970069
--- Diff: core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -598,6 +598,88 @@ class DAGSchedulerSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/4055#discussion_r33967596
--- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala ---
@@ -305,7 +305,7 @@ private[spark] class MapOutputTrackerMaster(conf:
SparkConf
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7014#issuecomment-119052339
ping @pwendell
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6990#discussion_r33429725
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
@@ -833,8 +833,10 @@ private[spark] class BlockManager(
logDebug(Put
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5563#discussion_r33495482
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala ---
@@ -59,6 +59,10 @@ private[spark] class
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/4055#issuecomment-116388201
Hi @suyanNone any updates? I think this is a really important fix and you
are getting close. You've done a lot of work describing and exploring the
problem, it would
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6935#issuecomment-116390029
@steveloughran do you think you can pick this up with the test case? I
think we have a little better understanding of the problem. I do think your
approach
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6870#discussion_r33731682
--- Diff: core/src/main/scala/org/apache/spark/scheduler/ExecutorBlacklistTracker.scala ---
@@ -0,0 +1,175 @@
+/*
+ * Licensed to the Apache Software
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6870#issuecomment-117841586
Hi @jerryshao thanks for working on this. I did have one larger design
question, about what to do when there are too many black listed executors ...
though I'm not sure
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6870#discussion_r33730280
--- Diff: core/src/main/scala/org/apache/spark/scheduler/ExecutorBlacklistTracker.scala ---
@@ -0,0 +1,175 @@
+/*
+ * Licensed to the Apache Software
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6870#discussion_r33730300
--- Diff: core/src/test/scala/org/apache/spark/scheduler/ExecutorBlacklistTrackerSuite.scala ---
@@ -0,0 +1,158 @@
+/*
+ * Licensed to the Apache
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6968#discussion_r33740255
--- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
@@ -194,7 +194,7 @@ abstract class RDD[T: ClassTag](
@transient private var partitions_
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6990#discussion_r33741094
--- Diff: core/src/main/scala/org/apache/spark/storage/BlockManager.scala ---
@@ -833,8 +833,10 @@ private[spark] class BlockManager(
logDebug(Put
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6750#issuecomment-117866359
Jenkins, retest this please
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6750#discussion_r33826853
--- Diff: core/src/main/scala/org/apache/spark/TaskContextImpl.scala ---
@@ -30,6 +30,7 @@ private[spark] class TaskContextImpl(
override val
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6750#discussion_r33827287
--- Diff: core/src/main/scala/org/apache/spark/SparkException.scala ---
@@ -30,3 +30,13 @@ class SparkException(message: String, cause: Throwable
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6394#discussion_r32243556
--- Diff: yarn/src/main/scala/org/apache/spark/deploy/yarn/YarnAllocator.scala ---
@@ -225,12 +240,89 @@ private[yarn] class YarnAllocator(
logInfo
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6394#discussion_r32243839
--- Diff: yarn/src/test/scala/org/apache/spark/deploy/yarn/YarnAllocatorSuite.scala ---
@@ -242,4 +245,154 @@ class YarnAllocatorSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6423#discussion_r32256571
--- Diff: core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleReader.scala ---
@@ -33,23 +34,55 @@ private[spark] class HashShuffleReader[K, C
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6423#issuecomment-111246687
lgtm (the discussion I just started about types of iterators shouldn't
affect whether or not this can be merged)
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32258055
--- Diff: core/src/test/scala/org/apache/spark/rpc/akka/AkkaRpcEnvSuite.scala ---
@@ -47,4 +56,60 @@ class AkkaRpcEnvSuite extends RpcEnvSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32258616
--- Diff: core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala ---
@@ -18,8 +18,11 @@
package org.apache.spark.rpc
import java.net.URI
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32258932
--- Diff: core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala ---
@@ -182,3 +185,117 @@ private[spark] object RpcAddress {
RpcAddress(host, port
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259066
--- Diff: core/src/main/scala/org/apache/spark/rpc/RpcEnv.scala ---
@@ -182,3 +185,117 @@ private[spark] object RpcAddress {
RpcAddress(host, port
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259238
--- Diff: core/src/main/scala/org/apache/spark/rpc/akka/AkkaRpcEnv.scala ---
@@ -295,19 +297,20 @@ private[akka] class AkkaRpcEndpointRef(
actorRef
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259352
--- Diff: core/src/main/scala/org/apache/spark/util/AkkaUtils.scala ---
@@ -17,9 +17,9 @@
package org.apache.spark.util
+import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259388
--- Diff: core/src/main/scala/org/apache/spark/util/RpcUtils.scala ---
@@ -17,11 +17,11 @@
package org.apache.spark.util
-import
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259733
--- Diff: core/src/main/scala/org/apache/spark/util/RpcUtils.scala ---
@@ -47,14 +47,14 @@ object RpcUtils {
}
/** Returns the default
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259739
--- Diff: core/src/main/scala/org/apache/spark/util/RpcUtils.scala ---
@@ -47,14 +47,14 @@ object RpcUtils {
}
/** Returns the default
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32259961
--- Diff: core/src/test/scala/org/apache/spark/rpc/RpcEnvSuite.scala ---
@@ -162,9 +163,15 @@ abstract class RpcEnvSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32260319
--- Diff: core/src/test/scala/org/apache/spark/rpc/RpcEnvSuite.scala ---
@@ -539,6 +546,30 @@ abstract class RpcEnvSuite extends SparkFunSuite
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32260411
--- Diff: core/src/test/scala/org/apache/spark/rpc/akka/AkkaRpcEnvSuite.scala ---
@@ -17,9 +17,18 @@
package org.apache.spark.rpc.akka
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32260635
--- Diff: core/src/test/scala/org/apache/spark/rpc/akka/AkkaRpcEnvSuite.scala ---
@@ -47,4 +56,60 @@ class AkkaRpcEnvSuite extends RpcEnvSuite
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-111258945
Jenkins, ok to test
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6205#discussion_r32261768
--- Diff: core/src/test/scala/org/apache/spark/rpc/akka/AkkaRpcEnvSuite.scala ---
@@ -47,4 +56,60 @@ class AkkaRpcEnvSuite extends RpcEnvSuite
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-111261738
Jenkins, test this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6377#issuecomment-105041178
Hi @scwf thanks for finding this. So I dug into this a little bit -- looks
like this is just a little bit of ignorance on my part when creating the test.
I used CST
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5572#issuecomment-105257051
@viirya @cloud-fan good point, I hadn't thought about multiple tasks on one
executor that are all pulling the same partition of `rdd2`. Still, I'm very
worried about
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6377#discussion_r30982314
--- Diff: core/src/test/scala/org/apache/spark/status/api/v1/SimpleDateParamSuite.scala ---
@@ -18,12 +18,13 @@ package org.apache.spark.status.api.v1
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5572#discussion_r30985762
--- Diff: core/src/main/scala/org/apache/spark/CacheManager.scala ---
@@ -48,7 +49,13 @@ private[spark] class CacheManager(blockManager:
BlockManager) extends
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5572#issuecomment-105252687
why do you think that just idea 1 will not result in a speedup? It will
also have the desired effect of only fetching the remote blocks once. So I
think
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5572#issuecomment-105269674
@viirya:
con: newly cached blocks, effectively adding some replication, but using
extra memory. Users will
most likely find that extra caching quite
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6362#issuecomment-104764030
lgtm, just waiting for tests (yeah I know it's probably unnecessary in this
case ...)
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5572#issuecomment-104784328
Shouldn't `CartesianRDD` be changed so that it calls `rdd1.iterator(...,
cacheRemote=true)` (same for `rdd2`)? Or does that happen somewhere and I'm
missing
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6291#issuecomment-104785997
can you add a test case? Seems especially important given that a previous
fix didn't actually catch the bug. I don't understand what's going on well
enough to know
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6362#issuecomment-104763940
ok to test
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6362#issuecomment-104765052
actually, do you have any interest in adding a test case to
`MetricsSystemSuite` for sink configuration? It doesn't
look like there is anything
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5964#discussion_r30862115
--- Diff: core/src/main/scala/org/apache/spark/shuffle/IndexShuffleBlockResolver.scala ---
@@ -74,12 +74,16 @@ private[spark] class IndexShuffleBlockResolver
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5964#discussion_r30862163
--- Diff: core/src/test/scala/org/apache/spark/scheduler/DAGSchedulerSuite.scala ---
@@ -545,9 +546,21 @@ class DAGSchedulerSuite
taskSet.tasks(1
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5964#discussion_r30862004
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala ---
@@ -1031,15 +1036,31 @@ class DAGScheduler(
case smt
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5964#discussion_r30861958
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala ---
@@ -830,6 +830,10 @@ class DAGScheduler(
logDebug(submitMissingTasks
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5964#issuecomment-104457784
Would love to get some feedback from the scheduler maintainers: @mateiz
@markhamstra @kayousterhout @pwendell
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/5964#discussion_r30862083
--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -159,6 +159,12 @@ private[spark] class TaskSchedulerImpl
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6166#discussion_r30837452
--- Diff: yarn/src/main/scala/org/apache/spark/scheduler/cluster/YarnClusterSchedulerBackend.scala ---
@@ -53,4 +62,65 @@ private[spark] class
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6205#issuecomment-104395631
@hardmettle I don't think there are any occurrences of `Await.ready` that
are relevant here -- where they do occur, they aren't tied to a configuration
variable
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5964#issuecomment-104663941
all that said, I do see the value in separating them out as well, so I'll
do that in addition. But I'd still like reviewers to consider this
holistically
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5964#issuecomment-104660767
@kayousterhout I debated doing that, but I kept them together b/c my test
case produces all four. So I wouldn't have passing tests unless I addressed
them all. (Though
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6284#issuecomment-104346714
retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6291#issuecomment-105622772
@markhamstra oh I'm not saying that your change is bad or questionable at
all. But I am wondering, what actually went wrong before this change? Are we
sure this change
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6291#issuecomment-105644015
thanks Mark. If I understand correctly, the earlier PR did fix the
`NoSuchElementException` as reported in the JIRA. So it's not like you can
write a test case which
GitHub user squito opened a pull request:
https://github.com/apache/spark/pull/6420
[SPARK-7829][CORE][WIP] multiple attempts of shuffle tasks is OK
fix for one of the retry issues -- data files are appended to, but the
index file always points to the beginning of the file
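The failure mode described above can be sketched in miniature: a shuffle index file records byte offsets bounding each partition's segment in a data file, so if a retried task appends fresh output instead of overwriting, the recorded offsets still point at the first attempt's bytes. The file layout below is illustrative only, not Spark's actual `IndexShuffleBlockResolver` format:

```java
import java.io.*;
import java.nio.file.*;

public class IndexedSegments {
    // Write segments plus an index of cumulative offsets: index entries
    // i and i+1 bound partition i's bytes in the data file.
    static void write(Path data, Path index, byte[][] segments) throws IOException {
        try (OutputStream d = Files.newOutputStream(data);
             DataOutputStream ix = new DataOutputStream(Files.newOutputStream(index))) {
            long offset = 0;
            ix.writeLong(offset);
            for (byte[] seg : segments) {
                d.write(seg);
                offset += seg.length;
                ix.writeLong(offset);
            }
        }
    }

    // Read partition i using the offsets recorded in the index file.
    static byte[] read(Path data, Path index, int i) throws IOException {
        long start, end;
        try (DataInputStream ix = new DataInputStream(Files.newInputStream(index))) {
            ix.skipBytes(i * 8);
            start = ix.readLong();
            end = ix.readLong();
        }
        try (RandomAccessFile raf = new RandomAccessFile(data.toFile(), "r")) {
            raf.seek(start);
            byte[] out = new byte[(int) (end - start)];
            raf.readFully(out);
            return out;
        }
    }

    public static void main(String[] args) throws IOException {
        Path data = Files.createTempFile("shuffle", ".data");
        Path index = Files.createTempFile("shuffle", ".index");
        write(data, index, new byte[][] { "aaa".getBytes(), "bbbb".getBytes() });
        System.out.println(new String(read(data, index, 1))); // bbbb

        // A "retry" that appends to the data file without rewriting the
        // index leaves the offsets stale: reads still hit the old bytes.
        Files.write(data, "cccDDDD".getBytes(), StandardOpenOption.APPEND);
        System.out.println(new String(read(data, index, 1))); // still bbbb
    }
}
```

Keeping the pair consistent means either overwriting data and index together on retry, or validating that the index's final offset matches the data file's length before serving reads.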
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/6377#discussion_r31033487
--- Diff: core/src/test/scala/org/apache/spark/status/api/v1/SimpleDateParamSuite.scala ---
@@ -18,12 +18,12 @@ package org.apache.spark.status.api.v1
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/6750#issuecomment-119279396
@kayousterhout I think I addressed your comments. I had some questions
about the best way to make the attemptId condition cleaner -- I went ahead with
one approach here
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7028#issuecomment-119291909
Hi @aarondav , sorry I had missed the part about breaking the api. It
looks to me like the only place `JobWaiter.jobFailed` is called with a
non-SparkException is when
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7770#issuecomment-127114536
What's the reason for using accumulators instead of TaskMetrics?
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7770#discussion_r36054177
--- Diff: core/src/main/scala/org/apache/spark/Aggregator.scala ---
@@ -89,13 +84,18 @@ case class Aggregator[K, V, C] (
} else {
val
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7770#discussion_r36055045
--- Diff: core/src/main/scala/org/apache/spark/scheduler/DAGScheduler.scala ---
@@ -773,16 +773,26 @@ class DAGScheduler(
stage.pendingTasks.clear
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7770#discussion_r36055191
--- Diff: core/src/main/scala/org/apache/spark/ui/ToolTips.scala ---
@@ -62,6 +62,13 @@ private[spark] object ToolTips {
Time that the executor spent
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7014#issuecomment-130336611
merged to master and 1.5, thanks @tomwhite!
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/5400#issuecomment-130331352
Hey @tgravescs , sorry for the late response. Absolutely, it would be
great to get you to start testing this. I'm a bit overloaded at the moment but
probably next week
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7739#discussion_r36820995
--- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerBackend.scala ---
@@ -88,7 +88,10 @@ private[spark] class
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7943#discussion_r36873941
--- Diff: network/shuffle/src/main/java/org/apache/spark/network/shuffle/ExternalShuffleBlockResolver.java ---
@@ -252,4 +338,118 @@ public String toString
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7839#issuecomment-128431151
I'm closing this in favor of the leveldb-based approach
https://github.com/apache/spark/pull/7943 -- but I'll make the relevant
suggestions from here
Github user squito closed the pull request at:
https://github.com/apache/spark/pull/7839
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/7839#discussion_r36436044
--- Diff: network/yarn/src/main/java/org/apache/spark/network/yarn/YarnShuffleService.java ---
@@ -100,11 +121,34 @@ private boolean isAuthenticationEnabled
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7975#issuecomment-128432795
lgtm
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7910#issuecomment-128433323
Jenkins, retest this please
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/8090#issuecomment-130492201
I was thinking that my points about simplifying the logic here and making
this safer would be pretty clear, but I guess that is not the case. If that is
at all
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/8153#issuecomment-130510800
minor style comment, otherwise makes sense to me
Github user squito commented on a diff in the pull request:
https://github.com/apache/spark/pull/8153#discussion_r36936462
--- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala ---
@@ -204,13 +204,21 @@ private[history] class FsHistoryProvider(conf
Github user squito commented on the pull request:
https://github.com/apache/spark/pull/7943#issuecomment-132614143
Jenkins, retest this please