Hi,

I'm new to Spark, starting out with Spark 1.5 using the Java API
(about to upgrade to 1.6 soon).
I am deploying a Spark Streaming application using spark-submit in
yarn-cluster mode.
What is the recommended way to perform a graceful shutdown of the Spark job?

I have already tried setting the spark.streaming.stopGracefullyOnShutdown
configuration, adding a shutdown hook, and implementing the onStop() method.
I can see in the logs that the hook methods are called and the configuration
is read, but the application still terminates immediately, and it also does
not clean up the Spark staging directory on HDFS.
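For reference, the behaviour I'm trying to get is roughly the following. This is a plain-JDK sketch of a blocking shutdown hook (no Spark classes; the class and variable names are my own illustration): the point is that the hook must block until in-flight work drains, because the JVM exits as soon as the hooks return. In the real job the hook body would instead call jssc.stop(true, true) on the JavaStreamingContext.

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.atomic.AtomicBoolean;

public class GracefulShutdownSketch {
    public static void main(String[] args) throws InterruptedException {
        AtomicBoolean stopRequested = new AtomicBoolean(false);
        CountDownLatch drained = new CountDownLatch(1);

        // Worker thread standing in for the streaming micro-batch loop.
        Thread worker = new Thread(() -> {
            int batches = 0;
            while (!stopRequested.get() && batches < 5) {
                batches++; // "process" one batch
            }
            System.out.println("drained " + batches + " batches");
            drained.countDown();
        });
        worker.start();

        // Shutdown hook: signals stop, then BLOCKS until the worker
        // has drained. If it returned immediately, the JVM would kill
        // the worker mid-batch -- which matches the symptom I see.
        Runtime.getRuntime().addShutdownHook(new Thread(() -> {
            stopRequested.set(true);
            try {
                drained.await();
            } catch (InterruptedException ignored) {
                Thread.currentThread().interrupt();
            }
            System.out.println("graceful shutdown complete");
        }));

        worker.join();
    }
}
```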

Thanks,
Guy


This message and the information contained herein are proprietary and
confidential and subject to the Amdocs policy statement, which you may
review at http://www.amdocs.com/email_disclaimer.asp
