Many Thanks Silvio,

Someone also suggested using something similar:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
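
For example, with a hypothetical master URL and driver ID (on a standalone
cluster the driver ID appears in the Spark Master UI), the call would look
something like:

./bin/spark-class org.apache.spark.deploy.Client kill spark://master:7077 driver-20150506120200-0000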

Regards
jk


On Fri, May 8, 2015 at 2:12 AM, Silvio Fiorito <
silvio.fior...@granturing.com> wrote:

>   Hi James,
>
>  If you’re on Spark 1.3 you can use the kill command in spark-submit to
> shut it down. You’ll need the driver id from the Spark UI or from when you
> submitted the app.
>
>  spark-submit --master spark://master:7077 --kill <driver-id>
>
>  Thanks,
> Silvio
>
>   From: James King
> Date: Wednesday, May 6, 2015 at 12:02 PM
> To: user
> Subject: Stop Cluster Mode Running App
>
>   I submitted a Spark Application in cluster mode, and now every time I
> stop the cluster and restart it the job resumes execution.
>
>  I even killed a daemon called DriverWrapper; that stops the app, but it
> resumes again.
>
>  How can I stop this application from running?
>
