fail and kill all tasks without killing job.

2012-07-20 Thread jay vyas
Hi guys: I want my tasks to end/fail, but I don't want to kill my
entire Hadoop job.

I have a Hadoop application that runs 5 jobs in a row.
I'm on the last of those sub-jobs, and want to fail all of its tasks so
that the jobtracker stops re-scheduling them,
and the whole run can naturally come to a close.

However, when I run "hadoop job kill-attempt / fail-attempt", the
jobtracker seems to simply relaunch
the same tasks with new ids.

How can I tell the jobtracker to give up on re-scheduling them?


Re: fail and kill all tasks without killing job.

2012-07-20 Thread Bejoy KS
Hi Jay

Did you try
hadoop job -kill-task <task-attempt-id>? And is that not working as desired?

Regards
Bejoy KS

Sent from handheld, please excuse typos.




Re: fail and kill all tasks without killing job.

2012-07-20 Thread JAX
I believe that kill-task simply kills that attempt, but then the same
task starts again with a new attempt id.

Jay Vyas 
MMSB
UCHC



Re: fail and kill all tasks without killing job.

2012-07-20 Thread Harsh J
Hi Jay,

Fail a single task four times (the default), and that task, and with it
the whole job, will be marked as failed. Is that what you're looking for?

Or, if you want the job to succeed even when not all of its tasks
succeed, tweak the "mapred.max.map/reduce.failures.percent" properties
in your job (by default they allow 0% failures, so set a percentage
between 0 and 100 that is acceptable to you).

To then avoid having to fail a single task four times, lower
"mapred.map/reduce.max.attempts" from its default of 4.

Does this answer your question?




-- 
Harsh J