I'm trying to write a deployment job for a Spark application. The job sends yarn application -kill app_id to the cluster, but once the application receives the signal it dies without finishing whatever it is processing or stopping the stream. I'm using Spark Streaming. What's the best way to stop the application gracefully?

You can add a shutdown hook to your JVM and request the Spark Streaming context to stop gracefully:

import org.apache.spark.streaming.StreamingContext

/**
 * Shutdown hook to stop the streaming application gracefully on JVM shutdown.
 * @param ssCtx the StreamingContext to stop
 */
def addShutdownHook(ssCtx: StreamingContext) = {
  Runtime.getRuntime.addShutdownHook(new Thread() {
    override def run(): Unit = {
      // Finish processing in-flight batches, then stop the SparkContext too
      ssCtx.stop(stopSparkContext = true, stopGracefully = true)
    }
  })
}
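For context, here is a minimal sketch of how the hook might be wired into a driver program. The object name GracefulApp, the app name, and the ten-second batch interval are illustrative, not from the original post.

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object GracefulApp {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("GracefulApp") // illustrative name
    val ssc = new StreamingContext(conf, Seconds(10))

    // ... define the streaming computation on ssc here ...

    // Install the hook before starting the streams, so the SIGTERM
    // that YARN sends on kill can trigger a graceful stop
    addShutdownHook(ssc)

    ssc.start()
    ssc.awaitTermination()
  }
}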
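As an aside, Spark 1.4 and later can install an equivalent hook for you: setting spark.streaming.stopGracefullyOnShutdown to true in the SparkConf tells Spark Streaming to stop gracefully on JVM shutdown, which may remove the need for a hand-rolled hook.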