As part of PR https://github.com/apache/spark/pull/11723, I have added a
killAllTasks function that can be used to kill (or rather, interrupt)
individual tasks before an executor exits. If this PR is accepted, then for
task-level cleanups we can add a call to this function before the executor
exits.
On Wed, Apr 6, 2016 at 4:39 PM, Sung Hwan Chung wrote:
> My option so far seems to be using JVM's shutdown hook, but I was
> wondering if Spark itself had an API for tasks.
>
Spark would be using that under the hood anyway, so you might as well just
use the JVM shutdown hook directly.
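
In case a concrete example helps, here is a minimal sketch of that approach
(the object name and the cleanup body are made up for illustration; it
assumes the cleanup is per-executor and idempotent, and a JVM shutdown hook
will not run if the executor is killed with SIGKILL):

    import org.apache.spark.{SparkConf, SparkContext}

    object ExecutorCleanupSketch {
      // The object is instantiated once per JVM, so this flag makes the
      // hook registration happen at most once per executor.
      @volatile private var registered = false

      def registerOnce(): Unit = synchronized {
        if (!registered) {
          // Hypothetical cleanup; runs on normal JVM exit and SIGTERM,
          // but not if the executor JVM is killed with SIGKILL.
          sys.addShutdownHook { println("executor JVM shutting down, cleaning up") }
          registered = true
        }
      }

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("shutdown-hook-sketch"))
        sc.parallelize(1 to 100, 4).map { i =>
          ExecutorCleanupSketch.registerOnce() // register the hook from inside the task
          i * 2
        }.count()
        sc.stop()
      }
    }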
What I meant is 'application'. I.e., when we manually terminate an
application that was submitted via spark-submit.
When we manually kill an application, it seems that individual tasks do not
receive an InterruptedException.
That InterruptedException seems to work only if we cancel the job through
Why would the Executors shut down when the Job is terminated? Executors are
bound to Applications, not Jobs. Furthermore,
unless spark.job.interruptOnCancel is set to true, canceling the Job at the
Application and DAGScheduler level won't actually interrupt the Tasks
running on the Executors.
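
To illustrate the distinction, here is a rough sketch of cancelling a job
group with interruption enabled (the group id, sleep times, and the cleanup
placeholder are made up for illustration). setJobGroup(..., interruptOnCancel
= true) sets spark.job.interruptOnCancel for that group, so cancelJobGroup
interrupts the running task threads while the executors and the application
keep running:

    import org.apache.spark.{SparkConf, SparkContext}

    object InterruptOnCancelSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("interrupt-on-cancel-sketch"))

        val runner = new Thread(new Runnable {
          def run(): Unit = {
            // interruptOnCancel = true sets spark.job.interruptOnCancel for this
            // group, so cancelJobGroup will interrupt the running task threads.
            sc.setJobGroup("long-job", "interruptible work", interruptOnCancel = true)
            try {
              sc.parallelize(1 to 10, 10).map { i =>
                try {
                  Thread.sleep(60 * 1000) // stand-in for long-running work
                  i
                } catch {
                  case e: InterruptedException =>
                    // task-level cleanup would go here
                    throw e
                }
              }.count()
            } catch {
              case e: Exception => println("job ended: " + e.getMessage)
            }
          }
        })
        runner.start()

        Thread.sleep(5 * 1000)
        // Cancels only this job group; executors and the application keep running.
        sc.cancelJobGroup("long-job")
        runner.join()
        sc.stop()
      }
    }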
Hi,
I'm looking for ways to add shutdown hooks to executors, i.e., for when a Job
is forcefully terminated before it finishes.
The scenario goes like this: executors are running a long-running job
within a 'map' function. The user decides to terminate the job, and the
mappers should then perform some cleanup.
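
For the per-record part of that cleanup, one possible sketch (longRunningWork
and cleanup are hypothetical stand-ins) is a try/finally inside the map
closure, which also runs when the task thread is interrupted mid-record; it
will not help if the executor JVM is killed outright:

    import org.apache.spark.{SparkConf, SparkContext}

    object MapCleanupSketch {
      // Hypothetical stand-ins for the real work and its cleanup.
      def longRunningWork(i: Int): Int = { Thread.sleep(1000); i * 2 }
      def cleanup(i: Int): Unit = println(s"cleaning up after record $i")

      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("map-cleanup-sketch"))
        val doubled = sc.parallelize(1 to 100, 4).map { record =>
          try {
            longRunningWork(record)
          } finally {
            cleanup(record) // also runs if the task thread is interrupted mid-record
          }
        }.collect()
        println(s"processed ${doubled.length} records")
        sc.stop()
      }
    }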