Re: How to stop a mapreduce job from terminal running on Hadoop Cluster?

2015-04-12 Thread unmesha sreeveni
You can use: $ hadoop job -kill <job id>
On Mon, Apr 13, 2015 at 10:20 AM, Rohith Sharma K S rohithsharm...@huawei.com wrote: In addition to the options below, Hadoop 2.7 (yet to be released in a couple of weeks) provides a user-friendly option for killing applications from the Web UI.
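A minimal sketch of the hadoop job route suggested above, assuming the job id is looked up first (the id shown is only a placeholder):

$ hadoop job -list                          # lists running jobs and their ids
$ hadoop job -kill job_1428900000000_0001   # substitute the id reported by -list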

RE: How to stop a mapreduce job from terminal running on Hadoop Cluster?

2015-04-12 Thread Rohith Sharma K S
In addition to the options below, Hadoop 2.7 (yet to be released in a couple of weeks) provides a user-friendly option for killing applications from the Web UI. In the application block, a ‘Kill Application’ button has been provided for killing applications. Thanks & Regards, Rohith Sharma K S
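On versions that expose the ResourceManager REST API, an application can also be killed without opening the Web UI. This is only a hedged sketch, assuming the Cluster Application State endpoint is enabled on your release; the RM host/port and the application id are placeholders:

$ curl -X PUT -H "Content-Type: application/json" \
    -d '{"state":"KILLED"}' \
    http://<rm-host>:8088/ws/v1/cluster/apps/application_1428900000000_0001/state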

How to stop a mapreduce job from terminal running on Hadoop Cluster?

2015-04-12 Thread Answer Agrawal
To run a job we use the command $ hadoop jar example.jar inputpath outputpath. If the job is taking too long and we want to stop it midway, which command should be used? Or is there any other way to do that? Thanks,
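Each of the kill commands in the replies needs a job or application id. When the job is submitted with hadoop jar as above, the client output normally contains a line similar to the one below (the id is a placeholder); that is the id to pass to the kill command:

INFO mapreduce.Job: Running job: job_1428900000000_0001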

Re: How to stop a mapreduce job from terminal running on Hadoop Cluster?

2015-04-12 Thread Pradeep Gollakota
Also, mapred job -kill <job id>
On Sun, Apr 12, 2015 at 11:07 AM, Shahab Yunus shahab.yu...@gmail.com wrote: You can kill it by using the following yarn command: yarn application -kill <application id> (https://hadoop.apache.org/docs/r2.2.0/hadoop-yarn/hadoop-yarn-site/YarnCommands.html)
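A hedged sketch of the mapred job route, assuming the job id is already known (placeholder id shown):

$ mapred job -kill job_1428900000000_0001     # request the kill
$ mapred job -status job_1428900000000_0001   # check the reported job state afterwards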

Re: How to stop a mapreduce job from terminal running on Hadoop Cluster?

2015-04-12 Thread Shahab Yunus
You can kill it by using the following yarn command: yarn application -kill <application id> (https://hadoop.apache.org/docs/r2.2.0/hadoop-yarn/hadoop-yarn-site/YarnCommands.html). Or use the old hadoop job command (http://stackoverflow.com/questions/11458519/how-to-kill-hadoop-jobs). Regards, Shahab
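A minimal sketch of the yarn application route (the application id is a placeholder):

$ yarn application -list                                  # find the application id of the running job
$ yarn application -kill application_1428900000000_0001   # killing the application stops the MapReduce job it runs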