Can you try the following?

hdfs fsck / -openforwrite -files -blocks -locations | grep blk_1109280129_1099547327549
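For context, this error usually means the file is still open for write (a writer such as Flume or a killed client never closed it), so the last block's length is not finalized. A rough sketch of the follow-up, assuming fsck reports a hypothetical path /flume/events/data.tmp; note that the `hdfs debug recoverLease` subcommand only exists in Hadoop 2.7 and later, so on 2.3.0 copying out the readable bytes or having the original writer re-close the file is the usual workaround:

```shell
# List files still open for write anywhere under / (path is an example;
# narrow it to the directory your job reads from if the output is large).
hdfs fsck / -openforwrite | grep OPENFORWRITE

# Hadoop 2.7+ only: force lease recovery so the last block gets finalized.
# /flume/events/data.tmp is a hypothetical path standing in for whatever
# the fsck output reports.
hdfs debug recoverLease -path /flume/events/data.tmp -retries 3
```

These commands require a running HDFS cluster, so they are shown here only as a sketch.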



Thanks & Regards

 Brahma Reddy Battula


________________________________
From: Adnan Karač [adnanka...@gmail.com]
Sent: Tuesday, May 26, 2015 1:34 PM
To: user@hadoop.apache.org
Subject: Cannot obtain block length for LocatedBlock

Hi all,

I have an MR job running and exiting with following exception.

java.io.IOException: Cannot obtain block length for LocatedBlock
{BP-1632531813-172.19.67.67-1393407344218:blk_1109280129_1099547327549;
getBlockSize()=139397; corrupt=false; offset=0;
locs=[172.19.67.67:50010, 172.19.67.78:50010, 172.19.67.84:50010]}

Now, the fun part is that I don't know which file is affected. To find
out, I ran:

hdfs fsck -files -blocks  / | grep blk_1109280129_1099547327549

Interestingly enough, it came up with nothing.

Has anyone experienced anything similar? Or does anyone have advice on
how to resolve this?

The Hadoop version is 2.3.0.

Thanks in advance!

--
Adnan Karač
