Sent: Monday, October 29, 2012 5:43 PM
To: user@hadoop.apache.org
Subject: RE: How to do HADOOP RECOVERY ???
Hi Uma,
You are correct: when I start the cluster it goes into safe mode, and if I do wait it doesn't come out on its own.
So I use the -safemode leave option.
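For reference, these are the 0.20-era safe mode commands (a sketch; they assume a client configured to reach the NameNode):

```shell
# Report whether the NameNode is currently in safe mode
hadoop dfsadmin -safemode get

# Block until the NameNode leaves safe mode on its own
hadoop dfsadmin -safemode wait

# Force the NameNode out of safe mode without waiting for block reports
hadoop dfsadmin -safemode leave
```

Note that -safemode leave only lifts the restriction; it does not bring back blocks that no DataNode holds.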
Safe mode is ON. The ratio of reported blocks is 0. The namespace has the info for this file, but the DNs do not have any blocks related to it.
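The NameNode stays in safe mode until the DataNodes have reported a minimum fraction of blocks; with no DNs reporting, that ratio stays at 0 and it waits indefinitely. The fraction is controlled in hdfs-site.xml (shown below with what I believe is the 0.20.x default, as an assumption):

```xml
<!-- hdfs-site.xml: fraction of blocks that must be reported
     before the NameNode exits safe mode automatically -->
<property>
  <name>dfs.safemode.threshold.pct</name>
  <value>0.999</value>
</property>
```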
From: yogesh.kuma...@wipro.com [yogesh.kuma...@wipro.com]
Sent: Monday, October 29, 2012 4:13 PM
To: user@hadoop.apache.org
Subject: RE: How to do HADOOP RECOVERY ???
Thanks Uma,
I am using hadoop-0.20.2 version.
The UI shows:
Cluster Summary
379 files and directories, 270 blocks
Please suggest.
Regards,
Yogesh Kumar
Regards
Bejoy KS
Sent from handheld, please excuse typos.
From: Uma Maheswara Rao G [mahesw...@huawei.com]
Sent: Monday, October 29, 2012 3:52 PM
To: user@hadoop.apache.org
Subject: RE: How to do HADOOP RECOVERY ???
Which version of Hadoop are you using?
Do you have all DNs up and running?
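One way to check is dfsadmin -report, which lists live and dead DataNodes (a sketch, assuming the client is configured for the cluster):

```shell
# Summarize DataNode status, capacity, and last-contact times;
# dead or unregistered DNs explain blocks that never get reported
hadoop dfsadmin -report
```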
From: yogesh.kuma...@wipro.com [yogesh.kuma...@wipro.com]
Sent: Monday, October 29, 2012 2:03 PM
To: user@hadoop.apache.org
Subject: How to do HADOOP RECOVERY ???
Hi All,
I ran this command:
hadoop fsck -Ddfs.http.address=localhost:50070 /
and found that some blocks are missing or corrupt.
The results come out like:
/user/hive/warehouse/tt_report_htcount/00_0: MISSING 2 blocks of total size 71826120 B..
/user/hive/warehouse/tt_report_perhour_hit/00_
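To triage a report like the one above, one option is to collect every path fsck flags before deciding whether to restore the files from their source or clear them with fsck's -move/-delete options. A minimal parsing sketch (the line format is assumed from the output shown above; corrupt_paths is a hypothetical helper, not part of Hadoop):

```python
import re

def corrupt_paths(fsck_output):
    """Collect HDFS paths that fsck flagged as MISSING or CORRUPT.

    Assumes lines shaped like:
      /path/to/file: MISSING 2 blocks of total size 71826120 B..
    """
    pattern = re.compile(r'^(\S+):\s+(?:MISSING|CORRUPT)', re.MULTILINE)
    return pattern.findall(fsck_output)

report = """\
/user/hive/warehouse/tt_report_htcount/00_0: MISSING 2 blocks of total size 71826120 B..
/user/hive/warehouse/healthy_table/00_0: OK
"""
print(corrupt_paths(report))  # -> ['/user/hive/warehouse/tt_report_htcount/00_0']
```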