The Spark web UI has a kill link. You can try using that.
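
If the kill link does not work for your setup, here is a rough sketch of the command-line alternatives (assuming a standalone cluster where you can read the driver ID from the master web UI, or YARN where you have the application ID):

    # Standalone cluster mode: kill the driver by its ID (listed on the master web UI)
    ./bin/spark-class org.apache.spark.deploy.Client kill <master-url> <driver-id>

    # YARN cluster mode: kill the whole application
    yarn application -kill <application-id>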

Best Regards,
Sonal
Founder, Nube Technologies <http://www.nubetech.co>

<http://in.linkedin.com/in/sonalgoyal>



On Tue, Nov 11, 2014 at 7:28 PM, Tao Xiao <xiaotao.cs....@gmail.com> wrote:

> I'm using Spark 1.0.0 and I'd like to kill a job running in cluster mode,
> which means the driver is not running on the local node.
>
> So how can I kill such a job? Is there a command like "hadoop job -kill
> <job-id>" that kills a running MapReduce job?
>
> Thanks
>
