Github user lins05 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16189#discussion_r91856289

--- Diff: core/src/main/scala/org/apache/spark/executor/Executor.scala ---
@@ -432,6 +458,78 @@ private[spark] class Executor(
   }
 
   /**
+   * Supervises the killing / cancellation of a task by sending the interrupted flag, optionally
+   * sending a Thread.interrupt(), and monitoring the task until it finishes.
+   */
+  private class TaskReaper(
+      taskRunner: TaskRunner,
+      val interruptThread: Boolean)
+    extends Runnable {
+
+    private[this] val taskId: Long = taskRunner.taskId
+
+    private[this] val killPollingFrequencyMs: Long =
+      conf.getTimeAsMs("spark.task.killPollingFrequency", "10s")
+
+    private[this] val killTimeoutMs: Long = conf.getTimeAsMs("spark.task.killTimeout", "2m")
+
+    private[this] val takeThreadDump: Boolean =
+      conf.getBoolean("spark.task.threadDumpKilledTasks", true)
+
+    override def run(): Unit = {
+      val startTimeMs = System.currentTimeMillis()
+      def elapsedTimeMs = System.currentTimeMillis() - startTimeMs
+      try {
+        while (!taskRunner.isFinished && (elapsedTimeMs < killTimeoutMs || killTimeoutMs <= 0)) {
+          taskRunner.kill(interruptThread = interruptThread)
--- End diff --

> In the case where we do interrupt, however, the introduction of this polling loop means that we'll interrupt the same task multiple times

My 2c: if the application code doesn't respond to the first interrupt immediately, chances are very low that it will respond to subsequent interrupts (it may be stuck in an infinite loop or a blocking JNI call), so sending multiple interrupts may not be necessary.
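To make the suggestion concrete, here is a minimal sketch (not the actual PR code) of a single-interrupt variant of the polling loop. It reuses the names from the diff above (`taskRunner`, `interruptThread`, `killTimeoutMs`, `killPollingFrequencyMs`); the `interruptSent` flag is hypothetical:

    // Hypothetical variant of TaskReaper.run(): Thread.interrupt() is sent
    // at most once, while the kill flag is still re-asserted on every poll.
    override def run(): Unit = {
      val startTimeMs = System.currentTimeMillis()
      def elapsedTimeMs = System.currentTimeMillis() - startTimeMs
      var interruptSent = false  // hypothetical flag, not in the PR
      while (!taskRunner.isFinished && (elapsedTimeMs < killTimeoutMs || killTimeoutMs <= 0)) {
        // Interrupt only on the first iteration; a task that ignored the first
        // interrupt (infinite loop, blocking JNI call) is unlikely to honor a repeat.
        taskRunner.kill(interruptThread = interruptThread && !interruptSent)
        interruptSent = true
        Thread.sleep(killPollingFrequencyMs)
      }
    }

The loop would still re-send the kill flag and enforce the timeout; only the repeated Thread.interrupt() calls are dropped.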