I'm trying to write a deployment job for a Spark application. Basically the
job sends yarn application -kill <app_id> to the cluster, but once the
application receives the signal it dies immediately, without finishing the
batch it is currently processing or stopping the stream cleanly.

I'm using Spark Streaming. What's the best way to stop a Spark application
so that we won't lose any data?
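
For reference, the pattern I've been looking at is roughly the sketch below
(assuming Spark 1.4+ for spark.streaming.stopGracefullyOnShutdown; the app
name and HDFS marker path are placeholders, and the pipeline itself is
omitted):

    import org.apache.hadoop.fs.{FileSystem, Path}
    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    object GracefulStopSketch {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("graceful-stop-sketch") // placeholder app name
          // Spark 1.4+: on JVM shutdown (e.g. the SIGTERM that
          // `yarn application -kill` sends first), stop the streaming
          // context gracefully instead of dying mid-batch.
          .set("spark.streaming.stopGracefullyOnShutdown", "true")

        val ssc = new StreamingContext(conf, Seconds(10))
        // ... build the DStream pipeline here ...
        ssc.start()

        // Alternative to killing via YARN: the deployment job creates a
        // marker file on HDFS, and the driver polls for it and stops itself.
        val marker = new Path("/tmp/stop-my-streaming-app") // placeholder
        val fs = FileSystem.get(ssc.sparkContext.hadoopConfiguration)

        var stopped = false
        while (!stopped) {
          // Block up to 10 s; returns true if the context was stopped.
          stopped = ssc.awaitTerminationOrTimeout(10000)
          if (!stopped && fs.exists(marker)) {
            // Graceful stop: finish in-flight batches, schedule no new
            // ones, then tear down the SparkContext as well.
            ssc.stop(stopSparkContext = true, stopGracefully = true)
            stopped = true
          }
        }
      }
    }

My understanding is that with a plain yarn application -kill, YARN sends
SIGTERM and then escalates to SIGKILL after
yarn.nodemanager.sleep-delay-before-sigkill.ms (250 ms by default), which may
not leave the shutdown hook enough time to drain a batch; that's why the
marker-file approach appeals to me. Is this the right direction, or is there
a better-supported way?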
