As part of PR https://github.com/apache/spark/pull/11723, I have added a
killAllTasks function that can be used to kill (or rather, interrupt)
individual tasks before an executor exits. If that PR is accepted, we could
add a call to this function before the executor exits to perform task-level
cleanups. The exit thread would then wait for a certain period of time
before the executor JVM exits, to allow the tasks to clean up properly.
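
A rough sketch of the idea (the names killAllTasks, runningTasks and the
grace period below are illustrative placeholders, not necessarily the exact
API or values in the PR):

    // Interrupt all running task threads, then wait a bounded time so the
    // tasks can run their cleanup (e.g. finally blocks) before the JVM exits.
    import java.util.concurrent.ConcurrentHashMap
    import scala.collection.JavaConverters._

    object ExecutorExitSketch {
      // taskId -> thread running the task (stand-in for the executor's
      // running-task bookkeeping)
      private val runningTasks = new ConcurrentHashMap[Long, Thread]()

      def killAllTasks(interrupt: Boolean): Unit = {
        runningTasks.asScala.values.foreach { t =>
          if (interrupt) t.interrupt()  // tasks observe the interrupt and clean up
        }
      }

      def exitExecutor(gracePeriodMs: Long = 5000L): Unit = {
        killAllTasks(interrupt = true)
        val deadline = System.currentTimeMillis() + gracePeriodMs
        // Give tasks a bounded window to finish cleanup before the JVM goes down.
        while (!runningTasks.isEmpty && System.currentTimeMillis() < deadline) {
          Thread.sleep(100)
        }
        System.exit(1)
      }
    }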

Hemant Bhanawat <https://www.linkedin.com/in/hemant-bhanawat-92a3811>
www.snappydata.io

On Thu, Apr 7, 2016 at 6:08 AM, Reynold Xin <r...@databricks.com> wrote:

>
> On Wed, Apr 6, 2016 at 4:39 PM, Sung Hwan Chung <coded...@cs.stanford.edu>
> wrote:
>
>> My option so far seems to be using JVM's shutdown hook, but I was
>> wondering if Spark itself had an API for tasks.
>>
>
> Spark would be using that under the hood anyway, so you might as well just
> use the JVM shutdown hook directly.
>
>
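
For reference, a minimal sketch of the shutdown-hook approach mentioned
above, using the plain JVM API from inside a task; the temp file and the
cleanup body are just illustrative:

    import java.io.File

    // e.g. inside a mapPartitions closure running on the executor
    val scratch = File.createTempFile("task-scratch", ".tmp")
    Runtime.getRuntime.addShutdownHook(new Thread(new Runnable {
      override def run(): Unit = {
        // Runs when the executor JVM shuts down normally or on SIGTERM
        // (but not on SIGKILL).
        scratch.delete()
      }
    }))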
