On Wed, Apr 6, 2016 at 4:39 PM, Sung Hwan Chung <coded...@cs.stanford.edu>
wrote:

> My option so far seems to be using JVM's shutdown hook, but I was
> wondering if Spark itself had an API for tasks.
>

Spark uses JVM shutdown hooks under the hood anyway, so you might as well
register a JVM shutdown hook directly.
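For reference, registering a hook directly is a one-liner on `Runtime`. A minimal sketch (the class name and messages are illustrative):

```java
public class ShutdownHookExample {
    public static void main(String[] args) {
        // Register a hook that the JVM runs on normal exit or SIGTERM
        // (it does NOT run on SIGKILL or a hard JVM crash).
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            System.out.println("cleanup: flushing state before exit");
        }));
        System.out.println("main finished");
    }
}
```

Running this prints "main finished" first, then the hook's message as the JVM shuts down. Keep hook bodies short and non-blocking; a hook that hangs can stall JVM shutdown.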
