and still cannot find which block is corrupt. When I search for the keyword 'orrupt', I only get

/hbase/.corrupt <dir>     , but that is a directory, not a corrupt block
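For anyone following along, this is the kind of search involved. The metasave excerpt below is invented for illustration (the line format and block IDs are assumptions, not output from this cluster):

```shell
# Invented metasave-style excerpt; "c:" is the corrupt-replica counter
metasave='blk_100 (replicas: l: 3 d: 0 c: 0 e: 0)
blk_200 (replicas: l: 2 d: 0 c: 1 e: 0)'

# Keep only lines where the corrupt-replica counter is not 0
printf '%s\n' "$metasave" | grep -v 'c: 0'
```

A case-insensitive `grep -i corrupt` would also have matched the /hbase/.corrupt directory entry, which is why searching on the counter field can be less noisy.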

On Sat, Jan 25, 2014 at 6:31 PM, Shekhar Sharma <shekhar2...@gmail.com>wrote:

> Run the fsck command:
>
> hadoop fsck <path> -files -blocks -locations
>  On 25 Jan 2014 08:04, "ch huang" <justlo...@gmail.com> wrote:
>
>> hi, maillist:
>>            this morning Nagios alerted that Hadoop has corrupt blocks. I
>> checked it using "hdfs dfsadmin -report", and from its output it does
>> have corrupt blocks:
>>
>> Configured Capacity: 53163259158528 (48.35 TB)
>> Present Capacity: 50117251458834 (45.58 TB)
>> DFS Remaining: 45289289015296 (41.19 TB)
>> DFS Used: 4827962443538 (4.39 TB)
>> DFS Used%: 9.63%
>> Under replicated blocks: 277
>> Blocks with corrupt replicas: 2
>> Missing blocks: 0
>>
>> but when I dumped all metadata using
>> # sudo -u hdfs hdfs dfsadmin -metasave
>>
>> and looked for records where "c:" is not 0, I could not find any block
>> with corrupt replicas. Why?
>>
>
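Building on the fsck suggestion above, fsck can also list the affected files directly. The commands below are a sketch: "/" is just an example path, and the report excerpt is invented for illustration since real output requires a live cluster:

```shell
# On the cluster (not runnable here): list only files with corrupt blocks
# hdfs fsck / -list-corruptfileblocks

# Filtering the full fsck report also works; demo on an invented excerpt
report='/user/a/f1: OK
/user/a/f2: CORRUPT blockpool BP-1 block blk_42
Status: CORRUPT'
printf '%s\n' "$report" | grep 'CORRUPT'
```

Note that fsck reports on blocks whose every replica is bad; a block with one corrupt replica but healthy remaining replicas shows up in "Blocks with corrupt replicas" in dfsadmin -report yet may still pass fsck as healthy.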
