You can do it from the API, I believe. Call getAssignedJobID() on
org.apache.hadoop.mapred.jobcontrol.Job to get the JobID of the job
you want to kill. Then call new JobClient().getJob(jobId).killJob().
On the new API you can just call killJob() on ControlledJob.
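For example, something like this (a rough sketch with the old API; JobID,
RunningJob, JobClient, and JobConf are all in org.apache.hadoop.mapred, and
note that JobClient needs a JobConf to locate the cluster, while getJob()
returns null if the JobTracker no longer knows about the job):

// job is the org.apache.hadoop.mapred.jobcontrol.Job you want to kill
JobID jobId = job.getAssignedJobID();
RunningJob runningJob = new JobClient(new JobConf()).getJob(jobId);
if (runningJob != null) {
  runningJob.killJob();
}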
Tom
Hi Tom,
in that case, can I kill the job with some command from the API, or
will I have to do it from the command line?
On Mon, Jul 20, 2009 at 8:55 PM, Tom White wrote:
> Hi Raakhi,
>
> You can't suspend MapReduce jobs in Hadoop, which is why the
> JobControl API doesn't support job suspension, only the ability to
> kill jobs.
Hi Raakhi,
You can't suspend MapReduce jobs in Hadoop, which is why the
JobControl API doesn't support job suspension, only the ability to
kill jobs.
Cheers,
Tom
On Mon, Jul 20, 2009 at 9:39 AM, Rakhi Khatwani wrote:
> Hi,
> I have a scenario in which I have a list of 5 jobs, and an event
> handler which, when triggered, would suspend all the running jobs.
Hi,
I have a scenario in which I have a list of 5 jobs, and an event
handler which, when triggered, would suspend all the running jobs.
But when I use a JobControl object and execute jobControl.suspend(),
it seems that only the JobControl thread gets suspended and not the 5
jobs it is managing.
Hi Raakhi,
JobControl is designed to be run from a new thread:
Thread t = new Thread(jobControl);
t.start();
Then you can run a loop to poll for job completion and print out status:
String oldStatus = null;
while (!jobControl.allFinished()) {
  String status = getStatusString(jobControl);
  if (!status.equals(oldStatus)) { // only print when the status changes
    System.out.println(status);
    oldStatus = status;
  }
  Thread.sleep(1000); // poll once a second (caller handles InterruptedException)
}
jobControl.stop(); // tell the JobControl thread to exit
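getStatusString() isn't part of Hadoop; a minimal version of such a
helper (my sketch, using the state getters on the old-API JobControl)
could look like:

private static String getStatusString(JobControl jc) {
  // Count the jobs in each state tracked by the JobControl.
  return "waiting: " + jc.getWaitingJobs().size()
      + ", ready: " + jc.getReadyJobs().size()
      + ", running: " + jc.getRunningJobs().size()
      + ", successful: " + jc.getSuccessfulJobs().size()
      + ", failed: " + jc.getFailedJobs().size();
}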
Hi,
I was trying out a map-reduce example using JobControl.
I create a JobConf object, conf1, and add the necessary information.
Then I create a job object:
Job job1 = new Job(conf1);
and then I declare a JobControl object as follows:
JobControl jobControl = new JobControl("JobControl1");
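The rest of the wiring follows the usual pattern (a sketch; the conf
settings stand in for whatever the real jobs configure):

import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.jobcontrol.Job;
import org.apache.hadoop.mapred.jobcontrol.JobControl;

JobConf conf1 = new JobConf();
// ... set mapper, reducer, input/output paths, etc. ...
Job job1 = new Job(conf1);

JobControl jobControl = new JobControl("JobControl1");
jobControl.addJob(job1);        // register the job with the controller
new Thread(jobControl).start(); // JobControl is a Runnable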