Hi guys: I want my tasks to end/fail, but I don't want to kill my
entire Hadoop job.
I have a Hadoop job that runs 5 Hadoop jobs in a row.
I'm on the last of those sub-jobs, and want to fail all of its tasks so that
the task tracker stops delegating them, and the main Hadoop job can
naturally come to an end.
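One way to do this from the command line is with the `hadoop job` CLI (a sketch, assuming a Hadoop 1.x-era cluster; the job ID and attempt ID below are placeholders, not values from this thread):

```shell
# List running jobs to find the ID of the current sub-job.
hadoop job -list

# List the running map task attempts for that job (placeholder job ID).
hadoop job -list-attempt-ids job_201207201717_0005 map running

# Mark a specific attempt as failed (placeholder attempt ID). Unlike
# -kill-task, -fail-task counts against the task's retry limit, so
# repeating it until the retries are exhausted fails the task for good.
hadoop job -fail-task attempt_201207201717_0005_m_000000_0
```

Note the distinction: killed attempts are simply rescheduled and do not count toward the failure limit, while failed attempts do.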
To: common-user@hadoop.apache.org
Reply-To: common-user@hadoop.apache.org
Subject: fail and kill all tasks without killing job.
Regards
Bejoy KS
Sent from handheld, please excuse typos.
-----Original Message-----
From: jay vyas jayunit...@gmail.com
Date: Fri, 20 Jul 2012 17:17:58
To: common-user@hadoop.apache.org
Reply-To: common-user@hadoop.apache.org
Subject: fail and kill all tasks without killing job.
Hi Jay,
If a single task fails four times (the default retry limit), the whole job
will be marked as failed. Is that what you're looking for?
Or, if you want the job to succeed even when not all of its tasks succeed,
tweak the mapred.max.map.failures.percent / mapred.max.reduce.failures.percent
properties in your job (by default they expect 0% failures).
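For example, the failure tolerance can be raised in the job configuration (a sketch using the old mapred API property names; the 20% and 10% thresholds are arbitrary illustrations, not recommendations):

```xml
<!-- Allow up to 20% of map tasks and 10% of reduce tasks to fail
     without failing the whole job. -->
<property>
  <name>mapred.max.map.failures.percent</name>
  <value>20</value>
</property>
<property>
  <name>mapred.max.reduce.failures.percent</name>
  <value>10</value>
</property>
```

The same settings are available programmatically on the old-API JobConf via setMaxMapTaskFailuresPercent(int) and setMaxReduceTaskFailuresPercent(int).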