RE: How to do HADOOP RECOVERY ???

2012-10-29 Thread Uma Maheswara Rao G
Monday, October 29, 2012 5:43 PM To: user@hadoop.apache.org Subject: RE: How to do HADOOP RECOVERY ??? Hi Uma, You are correct: when I start the cluster it goes into safe mode, and it doesn't come out even if I wait, so I use the -safemode leave option. Safe mode is ON. The ratio of reported blocks 0
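The safe-mode behaviour described above can be inspected from the command line. A minimal sketch using the dfsadmin safemode sub-commands available in Hadoop 0.20.x (assumes `hadoop` is on the PATH and configured for this cluster):

```shell
# Check whether the NameNode is currently in safe mode
hadoop dfsadmin -safemode get

# Block until the NameNode leaves safe mode on its own; it stays in
# safe mode while the ratio of reported blocks is below the configured
# threshold (dfs.safemode.threshold.pct)
hadoop dfsadmin -safemode wait

# Force the NameNode out of safe mode; with blocks still unreported,
# the missing blocks will then show up as corrupt/missing in fsck
hadoop dfsadmin -safemode leave
```

Forcing `leave` does not recover anything by itself; it only lets clients (and fsck) see the namespace, missing blocks included.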

RE: How to do HADOOP RECOVERY ???

2012-10-29 Thread yogesh.kumar13
The namespace has this info for this file, but the DNs do not have any block related to it. From: yogesh.kuma...@wipro.com [yogesh.kuma...@wipro.com] Sent: Monday, October 29, 2012 4:13 PM To: user@hadoop.apache.org Subject: RE: How to do HADOOP RECOVERY ??? Thanks Uma, I am using the hadoop-0.20.2 version

RE: How to do HADOOP RECOVERY ???

2012-10-29 Thread Uma Maheswara Rao G
From: yogesh.kuma...@wipro.com [yogesh.kuma...@wipro.com] Sent: Monday, October 29, 2012 4:13 PM To: user@hadoop.apache.org Subject: RE: How to do HADOOP RECOVERY ??? Thanks Uma, I am using the hadoop-0.20.2 version. UI shows: Cluster Summary 379 files and directories, 270 blocks

Re: How to do HADOOP RECOVERY ???

2012-10-29 Thread Bejoy KS
Regards, Bejoy KS. Sent from handheld, please excuse typos. -Original Message- From: Date: Mon, 29 Oct 2012 10:43:44 To: Reply-To: user@hadoop.apache.org Subject: RE: How to do HADOOP RECOVERY ??? Thanks Uma, I am using the hadoop-0.20.2 version. UI shows: Cluster Summary 379 files and

RE: How to do HADOOP RECOVERY ???

2012-10-29 Thread yogesh.kumar13
Please suggest. Regards, Yogesh Kumar From: Uma Maheswara Rao G [mahesw...@huawei.com] Sent: Monday, October 29, 2012 3:52 PM To: user@hadoop.apache.org Subject: RE: How to do HADOOP RECOVERY ??? Which version of Hadoop are you using? Do you have all DNs
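Uma's question about whether all DNs (DataNodes) are up can be answered from the command line with `dfsadmin -report`, which lists cluster capacity and the state of each DataNode. A sketch using the Hadoop 0.20.x CLI (run against the live cluster):

```shell
# Cluster summary (configured/used capacity) followed by one section
# per DataNode showing whether the NameNode considers it live or dead
hadoop dfsadmin -report

# Quick check for DataNodes the NameNode considers dead
hadoop dfsadmin -report | grep -i dead
```

If any DataNodes are dead here, restarting them and letting them re-report their blocks is the first recovery step to try before touching fsck's destructive options.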

RE: How to do HADOOP RECOVERY ???

2012-10-29 Thread Uma Maheswara Rao G
From: yogesh.kuma...@wipro.com [yogesh.kuma...@wipro.com] Sent: Monday, October 29, 2012 2:03 PM To: user@hadoop.apache.org Subject: How to do HADOOP RECOVERY ??? Hi All, I ran this command: hadoop fsck -Ddfs.http.address=localhost:50070 / and found that some blocks are missing and corrupted

How to do HADOOP RECOVERY ???

2012-10-29 Thread yogesh.kumar13
Hi All, I ran this command: hadoop fsck -Ddfs.http.address=localhost:50070 / and found that some blocks are missing and corrupted. The results come out like.. /user/hive/warehouse/tt_report_htcount/00_0: MISSING 2 blocks of total size 71826120 B.. /user/hive/warehouse/tt_report_perhour_hit/00_
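For reference, fsck in Hadoop 0.20.x can report exactly which files and blocks are affected and, once the blocks are confirmed unrecoverable (e.g. no dead DataNodes left to bring back), can move or delete the corrupt files. A sketch; the `-move` and `-delete` flags are destructive and are last resorts:

```shell
# Detailed health report: per-file block lists and the DataNodes
# holding each replica, to see which blocks have zero replicas left
hadoop fsck / -files -blocks -locations

# Move files with missing/corrupt blocks into /lost+found on HDFS,
# keeping whatever healthy blocks remain
hadoop fsck / -move

# Permanently delete the corrupt files (data is lost for good)
hadoop fsck / -delete
```

Running `-move` or `-delete` clears the corrupt entries so fsck reports the filesystem as HEALTHY again, but it does not recover the missing data; only re-reporting DataNodes (or restoring from a backup) can do that.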