[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-28 Thread devaraj-kavali
Github user devaraj-kavali closed the pull request at:

https://github.com/apache/spark/pull/11916


---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and wishes so, or if the feature is enabled but not working, please
contact infrastructure at infrastruct...@apache.org or file a JIRA ticket
with INFRA.
---

-
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org



[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-28 Thread devaraj-kavali
Github user devaraj-kavali commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-202318900
  
I have moved these changes to the PR 
https://github.com/apache/spark/pull/11996 for SPARK-10530. @tgravescs, please 
have a look at https://github.com/apache/spark/pull/11996 when you have some 
time. Thanks.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-25 Thread devaraj-kavali
Github user devaraj-kavali commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57459040
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

Thanks @tgravescs.

I would be happy to fix the issue of more than one attempt succeeding, as you 
explained, as part of this PR, but I think it would be better to handle it 
separately rather than mixing it into the current PR's changes.

I will move the current changes to a PR for 
[SPARK-10530](https://issues.apache.org/jira/browse/SPARK-10530), and we can 
continue fixing the multiple-attempts-success issue as part of 
[SPARK-13343](https://issues.apache.org/jira/browse/SPARK-13343). 
Please let me know if that doesn't make sense.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57369488
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

So I'll try the patch out, but I'm pretty sure it will still show multiple 
succeeded tasks that were speculative.

SparkHadoopMapRedUtil.commitTask has this check:

    if (committer.needsTaskCommit(mrTaskContext)) {
      ...
    } else {
      // Some other attempt committed the output, so we do nothing and signal success
      logInfo(s"No need to commit output of task because needsTaskCommit=false: $mrTaskAttemptID")
    }

So if another task commits, and then the second speculative task tries to 
commit, it's simply going to log this message and send the task-finished event 
back to the driver, and the driver is going to take that as success.

If your intention is just to solve the issue of killing tasks, perhaps 
move this PR to be for https://issues.apache.org/jira/browse/SPARK-10530 and 
leave SPARK-13343 open.
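To make the race described above concrete, here is a hypothetical sketch (not the actual Spark or Hadoop source; `CommitRaceSketch` and its `committed` flag are stand-ins for the committer's "already committed" state) of why a speculative attempt that loses the commit can still report success to the driver:

```scala
// Hypothetical sketch of the race described above, not actual Spark code.
// An atomic flag stands in for the OutputCommitter's "already committed" state.
import java.util.concurrent.atomic.AtomicBoolean

object CommitRaceSketch {
  // Only the first attempt to ask wins; later attempts see "already committed",
  // mirroring committer.needsTaskCommit(...) returning false after a commit.
  private val committed = new AtomicBoolean(false)

  // Returns the status an attempt would report back to the driver.
  def commitTask(attemptId: Int): String = {
    if (committed.compareAndSet(false, true)) {
      s"attempt $attemptId committed the output"
    } else {
      // Some other attempt committed; we only log and still signal success,
      // which is exactly why the driver can count two attempts as SUCCESS.
      s"attempt $attemptId skipped commit (needsTaskCommit=false) but still reports success"
    }
  }

  def main(args: Array[String]): Unit = {
    println(commitTask(0)) // first attempt commits
    println(commitTask(1)) // speculative attempt: no commit, but no failure either
  }
}
```

Both calls return without error; nothing in this path turns the losing attempt into a failure.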





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread devaraj-kavali
Github user devaraj-kavali commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57349258
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

I can see that during the map phase (which doesn't write to Hadoop) there is 
a chance of two attempts succeeding, as you explained. But for final-phase 
tasks (which do write to Hadoop), if two attempts try to rename 
taskAttemptPath to committedTaskPath during commitTask(), only one attempt 
will succeed and the other will fail with a rename failure.
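The rename-wins-once behavior described above can be illustrated with a toy sketch, using `java.nio`'s move on a local filesystem as a stand-in for Hadoop's `FileSystem.rename` (this is not Spark or Hadoop code; the paths are made up):

```scala
// Toy illustration of rename-based task commit: two attempts race to rename
// their output to the same committed path, and only the first can win.
import java.nio.file.{Files, Path, FileAlreadyExistsException}

object RenameCommitSketch {
  def main(args: Array[String]): Unit = {
    val committed = Files.createTempDirectory("job").resolve("committedTask")
    val attempt0  = Files.createTempFile("attempt0-", ".out")
    val attempt1  = Files.createTempFile("attempt1-", ".out")

    // Files.move without REPLACE_EXISTING refuses to clobber an existing
    // target, standing in for the "rename failure" of the losing attempt.
    def tryCommit(attempt: Path): Boolean =
      try { Files.move(attempt, committed); true }
      catch { case _: FileAlreadyExistsException => false }

    println(s"attempt 0 committed: ${tryCommit(attempt0)}") // first rename wins
    println(s"attempt 1 committed: ${tryCommit(attempt1)}") // second rename fails
  }
}
```

The point of contrast with the map phase is that here the losing attempt gets an actual error, rather than silently signalling success.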





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57342982
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

Correct, SparkHadoopMapRedUtil.commitTask prevents it from actually 
committing, but if that task-completion event gets sent back to the driver 
before your code above sends the kill, won't it still be marked as success? 
Maybe I'm wrong, but I seem to remember seeing that happen. If the commitTask 
happened at the right time you would see the CommitDenied, but if it happened 
later it just comes back with success, and I don't see how this code helps that.

I agree on the second issue; there is actually another JIRA for it as well: 
https://issues.apache.org/jira/browse/SPARK-10530, but if both are solved by 
the same change we can just dup them.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread devaraj-kavali
Github user devaraj-kavali commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57340394
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

Thanks @tgravescs for the comment.

If one attempt has actually completed (succeeded) but its success event has 
not yet reached this point, and during that time another attempt tries to 
commit the output, then SparkHadoopMapRedUtil.commitTask will prevent it from 
doing so. In the other case, if a task attempt completes in the Executor 
before receiving the kill signal from TaskSetManager.handleSuccessfulTask, 
the Executor ignores the kill request and there is no problem. I don't see a 
case where two attempts can both succeed when the task attempts use commit 
coordination; please help me understand if there are any.

The major issue here is that other task attempts keep running, holding 
executor threads, even after one attempt has already succeeded for the same 
task. Sometimes these unnecessary attempts keep running until job/application 
completion (if the worker nodes running them are very slow), which makes 
application performance worse.
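The fix being discussed, killing the remaining attempts once one succeeds, can be sketched as follows. This is a hypothetical simplification of the truncated diff above, not Spark's real TaskSetManager: `TaskInfo` and `Backend` here are made-up stand-ins for Spark's scheduler classes.

```scala
// Hypothetical sketch of the change under discussion: when one attempt of a
// task succeeds, ask the backend to kill the remaining running attempts.
case class TaskInfo(taskId: Long, attemptNumber: Int, executorId: String, running: Boolean)

class Backend {
  def killTask(taskId: Long, executorId: String, interruptThread: Boolean): Unit =
    println(s"killing task $taskId on executor $executorId")
}

object KillOtherAttemptsSketch {
  // Mirrors the guarded for-comprehension in the diff: skip the attempt that
  // just succeeded, and only kill attempts that are still running.
  def handleSuccessfulTask(attempts: Seq[TaskInfo], succeeded: TaskInfo, backend: Backend): Unit =
    for (attemptInfo <- attempts
         if attemptInfo.attemptNumber != succeeded.attemptNumber && attemptInfo.running) {
      backend.killTask(attemptInfo.taskId, attemptInfo.executorId, interruptThread = true)
    }

  def main(args: Array[String]): Unit = {
    val winner      = TaskInfo(1, 0, "exec-1", running = false)
    val speculative = TaskInfo(2, 1, "exec-2", running = true)
    handleSuccessfulTask(Seq(winner, speculative), winner, new Backend)
  }
}
```

As the thread notes, the kill is asynchronous: an attempt that finishes in the Executor before the kill arrives will still report back on its own, which is where the success-vs-killed race comes from.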






[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread tgravescs
Github user tgravescs commented on a diff in the pull request:

https://github.com/apache/spark/pull/11916#discussion_r57316516
  
--- Diff: 
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -620,6 +620,14 @@ private[spark] class TaskSetManager(
 // Note: "result.value()" only deserializes the value when it's called 
at the first time, so
 // here "result.value()" just returns the value and won't block other 
threads.
 sched.dagScheduler.taskEnded(tasks(index), Success, result.value(), 
result.accumUpdates, info)
+// Kill other task attempts if any as the one attempt succeeded
+for (attemptInfo <- taskAttempts(index) if attemptInfo.attemptNumber 
!= info.attemptNumber
--- End diff --

I've only taken a quick look at this PR, but I think there is a race here: 
if the speculative task finishes before this kill event can actually go out, 
it will still be marked as success. Please correct me if I'm wrong.

SparkHadoopMapRedUtil.commitTask can just log if something is already 
committed (sometimes it gets CommitDeniedException, depending on timing). 
At one point I had played with changing that to commit denied, but other 
issues came up and I didn't have time to finish investigating.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread devaraj-kavali
Github user devaraj-kavali commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200740334
  
Thanks @rxin and @andrewor14 for looking into the patch.

These failed tests in the latest build are not related to this patch and 
they have been failing in the previous builds as well.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200738785
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54016/
Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200738781
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-24 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200738616
  
**[Test build #54016 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54016/consoleFull)**
 for PR 11916 at commit 
[`7bbe734`](https://github.com/apache/spark/commit/7bbe73434801991083e81a416318d21c72f44b13).
 * This patch **fails Spark unit tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200698229
  
**[Test build #54016 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54016/consoleFull)**
 for PR 11916 at commit 
[`7bbe734`](https://github.com/apache/spark/commit/7bbe73434801991083e81a416318d21c72f44b13).





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200690718
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200690724
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/54009/
Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200690703
  
**[Test build #54009 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54009/consoleFull)**
 for PR 11916 at commit 
[`53023dd`](https://github.com/apache/spark/commit/53023dd47a2c3dec5320c97a6c3870a39a595a60).
 * This patch **fails Scala style tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200690203
  
**[Test build #54009 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/54009/consoleFull)**
 for PR 11916 at commit 
[`53023dd`](https://github.com/apache/spark/commit/53023dd47a2c3dec5320c97a6c3870a39a595a60).





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200679647
  
Test FAILed.
Refer to this link for build results (access rights to CI server needed): 
https://amplab.cs.berkeley.edu/jenkins//job/SparkPullRequestBuilder/53998/
Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200679645
  
Merged build finished. Test FAILed.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200679249
  
**[Test build #53998 has 
finished](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53998/consoleFull)**
 for PR 11916 at commit 
[`7c033b6`](https://github.com/apache/spark/commit/7c033b6d6dd7eb1d9296d82a965facec95dd6757).
 * This patch **fails Spark unit tests**.
 * This patch merges cleanly.
 * This patch adds no public classes.





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread SparkQA
Github user SparkQA commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200643122
  
**[Test build #53998 has 
started](https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/53998/consoleFull)**
 for PR 11916 at commit 
[`7c033b6`](https://github.com/apache/spark/commit/7c033b6d6dd7eb1d9296d82a965facec95dd6757).





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread andrewor14
Github user andrewor14 commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200641412
  
ok to test





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread rxin
Github user rxin commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200607117
  
cc @andrewor14 





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread AmplabJenkins
Github user AmplabJenkins commented on the pull request:

https://github.com/apache/spark/pull/11916#issuecomment-200325537
  
Can one of the admins verify this patch?





[GitHub] spark pull request: [SPARK-13343] [CORE] speculative tasks that di...

2016-03-23 Thread devaraj-kavali
GitHub user devaraj-kavali opened a pull request:

https://github.com/apache/spark/pull/11916

[SPARK-13343] [CORE] speculative tasks that didn't commit shouldn't be 
marked as success

## What changes were proposed in this pull request?

With this patch, killed tasks are no longer counted as failed tasks: they 
are listed separately in the UI and show the task state as KILLED instead of 
FAILED.

## How was this patch tested?

I have verified this patch manually: when any attempt gets killed, it is 
counted as a KILLED task and not as a FAILED task. Please find the attached 
screenshots for reference. 
[SPARK-13965](https://issues.apache.org/jira/browse/SPARK-13965) / 
https://github.com/apache/spark/pull/11778 kills the running task attempts 
immediately when any one of the attempts succeeds, and this patch will 
consider and show them as KILLED.

![stage-tasks-table](https://cloud.githubusercontent.com/assets/3174804/13984882/1e8deb66-f11f-11e5-9a89-e571dc5f1eef.png)

![stages-table](https://cloud.githubusercontent.com/assets/3174804/13984881/1e8d8216-f11f-11e5-9d29-22a7aca94938.png)


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/devaraj-kavali/spark SPARK-13343

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/11916.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #11916


commit 7c033b6d6dd7eb1d9296d82a965facec95dd6757
Author: Devaraj K 
Date:   2016-03-23T12:11:30Z

[SPARK-13343] [CORE] speculative tasks that didn't commit shouldn't be
marked as success



