GitHub user jerryshao opened a pull request: https://github.com/apache/spark/pull/6870
[SPARK-8425][Core][WIP] Add blacklist mechanism in task scheduling

This is a proposal to add a blacklist mechanism to Spark's task scheduling so that tasks avoid being scheduled on badly behaving executors. The mechanism tracks the finished status of tasks and uses a heuristic to identify consistently failing executors, which are then added to a blacklist; the idea is borrowed from MapReduce. For the motivation and detailed design doc, please refer to [SPARK-8424](https://issues.apache.org/jira/browse/SPARK-8424).

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/jerryshao/apache-spark SPARK-8425

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/6870.patch

To close this pull request, make a commit to your master/trunk branch with (at least) the following in the commit message:

    This closes #6870

----

commit 6df4a9f5f09dfa2f600a2bbf82c58589b16e81f9
Author: jerryshao <saisai.s...@intel.com>
Date: 2015-05-15T07:17:05Z

    blacklist mechanism initial commit

commit 031cb196ff4b9d5db11856a2724e760713682a6a
Author: jerryshao <saisai.s...@intel.com>
Date: 2015-05-19T09:33:21Z

    Enable blacklist in Yarn

commit 9e613557cf4bd89fabbb8ad4c2426119c588fae1
Author: jerryshao <saisai.s...@intel.com>
Date: 2015-06-12T01:21:51Z

    Continue working on this

commit b8066f3788fb6458eff8d5afcde266ab77d22fb1
Author: jerryshao <saisai.s...@intel.com>
Date: 2015-06-18T01:51:43Z

    code refactor and add unit test

----
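The proposal above tracks the finished status of tasks and blacklists executors that fail too often. A minimal sketch of that idea follows; this is not the PR's actual code, the class name, threshold, and reset policy are all hypothetical, and real implementations (including MapReduce's) also expire blacklist entries over time:

```java
import java.util.HashMap;
import java.util.HashSet;
import java.util.Map;
import java.util.Set;

// Hypothetical sketch: count task failures per executor and blacklist
// any executor whose failure count reaches a threshold.
public class ExecutorBlacklist {
    private final int maxFailures;                       // failures allowed before blacklisting
    private final Map<String, Integer> failureCounts = new HashMap<>();
    private final Set<String> blacklisted = new HashSet<>();

    public ExecutorBlacklist(int maxFailures) {
        this.maxFailures = maxFailures;
    }

    // Called when a task finishes; 'succeeded' is the task's final status.
    public void onTaskFinished(String executorId, boolean succeeded) {
        if (succeeded) {
            failureCounts.remove(executorId);            // simple heuristic: a success resets the count
            return;
        }
        int failures = failureCounts.merge(executorId, 1, Integer::sum);
        if (failures >= maxFailures) {
            blacklisted.add(executorId);
        }
    }

    // The scheduler would consult this before offering a task to an executor.
    public boolean isBlacklisted(String executorId) {
        return blacklisted.contains(executorId);
    }

    public static void main(String[] args) {
        ExecutorBlacklist bl = new ExecutorBlacklist(2);
        bl.onTaskFinished("exec-1", false);
        System.out.println(bl.isBlacklisted("exec-1")); // false: one failure is below the threshold
        bl.onTaskFinished("exec-1", false);
        System.out.println(bl.isBlacklisted("exec-1")); // true: threshold reached
        bl.onTaskFinished("exec-2", true);
        System.out.println(bl.isBlacklisted("exec-2")); // false: healthy executor
    }
}
```

Resetting the count on success is one possible heuristic; a production scheduler would also need to weigh per-task versus per-executor failures and remove stale entries so a transient problem does not permanently exclude a node.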