Thanks for your replies.
Actually, you can kill a driver with the command

bin/spark-class org.apache.spark.deploy.Client kill <spark-master> <driver-id>

if you know the driver ID.
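For reference, a concrete invocation against a standalone master might look like the sketch below. The master URL and driver ID here are made-up placeholders; substitute your own (the driver ID is printed when you submit in cluster mode, and is also shown on the master's web UI):

```shell
# Placeholder values for illustration only -- use your actual master URL and driver ID.
MASTER_URL="spark://my-master:7077"
DRIVER_ID="driver-20141111222222-0000"

# Kill command for a driver submitted in cluster mode on a standalone master.
# The 'echo' just prints the command; remove it to actually run the kill
# from the root of your Spark installation.
echo bin/spark-class org.apache.spark.deploy.Client kill "$MASTER_URL" "$DRIVER_ID"
```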
2014-11-11 22:35 GMT+08:00 Ritesh Kumar Singh <riteshoneinamill...@gmail.com>:
I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode,
which means the driver is not running on the local node.
So how can I kill such a job? Is there a command like "hadoop job -kill
<job-id>", which kills a running MapReduce job?
Thanks
There is a property,

spark.ui.killEnabled

which needs to be set to true to allow killing applications directly from the web UI.
See the Spark UI section of the configuration docs:
http://spark.apache.org/docs/latest/configuration.html#spark-ui
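If it helps, a minimal way to set that property is via conf/spark-defaults.conf (the property name is from the configuration page linked above; the file path assumes a standard Spark layout):

```
# conf/spark-defaults.conf -- enable the "kill" links on the web UI
spark.ui.killEnabled true
```

It can also be passed per-application with --conf spark.ui.killEnabled=true on spark-submit.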
Thanks
On Tue, Nov 11, 2014 at 7:42 PM, Sonal Goyal