Why would the Executors shut down when the Job is terminated?  Executors are
bound to Applications, not Jobs.  Furthermore,
unless spark.job.interruptOnCancel is set to true, canceling the Job at the
Application and DAGScheduler level won't actually interrupt the Tasks
running on the Executors.  If you do have interruptOnCancel set, then you
can catch the resulting InterruptedException within the Task and run your
cleanup there.
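
For reference, a minimal sketch of that pattern.  It's illustrative, not
authoritative: the job-group name, the sleep standing in for real work, and
the cleanup comment are placeholders; setJobGroup/cancelJobGroup are the
standard SparkContext API.

    import org.apache.spark.{SparkConf, SparkContext}

    object InterruptCleanupDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("interrupt-cleanup-demo").setMaster("local[2]"))
        try {
          // interruptOnCancel = true sets spark.job.interruptOnCancel for this
          // thread's job group, so canceling the group calls Thread.interrupt()
          // on the running task threads.
          sc.setJobGroup("long-job", "long-running map", interruptOnCancel = true)

          sc.parallelize(1 to 8).map { i =>
            try {
              Thread.sleep(60 * 1000L) // stand-in for the long-running work
              i
            } catch {
              case e: InterruptedException =>
                // Cancellation surfaces here (at blocking calls); do your
                // cleanup, then re-throw so the task is marked as killed.
                throw e
            }
          }.count()
        } finally {
          sc.stop()
        }
      }
    }

To trigger it, call sc.cancelJobGroup("long-job") from another thread (e.g.,
in response to the user's terminate request).  Two caveats: the interrupt
only surfaces as an InterruptedException at blocking calls, so CPU-bound
loops should poll Thread.currentThread().isInterrupted instead; and the flag
defaults to false because thread interrupts have historically caused
problems for the HDFS client.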

On Wed, Apr 6, 2016 at 12:24 PM, Sung Hwan Chung <coded...@gmail.com> wrote:

> Hi,
>
> I'm looking for ways to add shutdown hooks to executors: i.e., when a Job
> is forcefully terminated before it finishes.
>
> The scenario goes like this: executors are running a long-running job
> within a 'map' function. The user decides to terminate the job, then the
> mappers should perform some cleanup before going offline.
>
> What would be the best way to do this?
>
