What I meant is 'application', i.e., when we manually terminate an
application that was submitted via spark-submit. When we manually kill an
application, it seems that individual tasks do not receive the
InterruptedException.

That InterruptedException seems to be delivered only if we cancel the job
through sc.cancelJob or sc.cancelAllJobs while the application is still
alive.
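
For reference, the pattern that does deliver the interrupt for us looks
roughly like this (the RDD, the job group name, and the work inside the map
are just placeholders; setJobGroup with interruptOnCancel = true is what sets
spark.job.interruptOnCancel for the job):

import org.apache.spark.{SparkConf, SparkContext}

val sc = new SparkContext(new SparkConf().setAppName("cancel-demo"))

// interruptOnCancel = true sets spark.job.interruptOnCancel for this group
sc.setJobGroup("long-job", "long running map", interruptOnCancel = true)

sc.parallelize(1 to 100).map { i =>
  try {
    Thread.sleep(60000) // stand-in for long-running work
    i
  } catch {
    case _: InterruptedException =>
      // Runs when the job is cancelled from the driver while the app is
      // alive, e.g. via sc.cancelJobGroup("long-job") or sc.cancelAllJobs(),
      // but not when the whole application is killed externally.
      -1
  }
}.count()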

My best option so far seems to be a JVM shutdown hook, but I was wondering
whether Spark itself has an API for this at the task level.
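
Concretely, something along these lines is what I had in mind with the
shutdown hook; ExecutorCleanup, ensureHook, and releaseResources are made-up
names, and there is no guarantee the hook gets much time to run before the
executor JVM dies:

import java.util.concurrent.atomic.AtomicBoolean

object ExecutorCleanup {
  private val registered = new AtomicBoolean(false)

  // Register the cleanup at most once per executor JVM.
  def ensureHook(cleanup: () => Unit): Unit = {
    if (registered.compareAndSet(false, true)) {
      Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
        override def run(): Unit = cleanup()
      }))
    }
  }
}

// Called from inside the task, e.g.:
// rdd.map { x => ExecutorCleanup.ensureHook(() => releaseResources()); process(x) }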

On Wed, Apr 6, 2016 at 7:36 PM, Mark Hamstra <m...@clearstorydata.com>
wrote:

> Why would the Executors shut down when the Job is terminated?  Executors
> are bound to Applications, not Jobs.  Furthermore,
> unless spark.job.interruptOnCancel is set to true, canceling the Job at the
> Application and DAGScheduler level won't actually interrupt the Tasks
> running on the Executors.  If you do have interruptOnCancel set, then you
> can catch the interrupt exception within the Task.
>
> On Wed, Apr 6, 2016 at 12:24 PM, Sung Hwan Chung <coded...@gmail.com>
> wrote:
>
>> Hi,
>>
>> I'm looking for ways to add shutdown hooks to executors, i.e., for when a
>> Job is forcefully terminated before it finishes.
>>
>> The scenario goes like this: the executors are running a long-running job
>> within a 'map' function. The user decides to terminate the job, and the
>> mappers should then perform some cleanup before going offline.
>>
>> What would be the best way to do this?
>>
>
>
