Hello Kiran,

 Can you check whether the block is present on the DataNode, and check the
generation timestamp in the meta file (if you are familiar with it)?
 
 Can you grep for blk_-8354424441116992221 in your logs and paste the result here?
 We have seen this when a read happens in parallel with an in-progress block
recovery (in 0.20.x versions). If this problem is caused by recovery, you should
be able to read the file on the next attempt.

We can work out the scenario based on the grep results from your logs.
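In case it helps, the checks above can be sketched as shell commands. This is only a sketch: the DATA_DIR and LOG_DIR defaults are assumptions, so substitute your cluster's actual dfs.data.dir and Hadoop log directory.

```shell
#!/bin/sh
# Sketch of the checks above. DATA_DIR and LOG_DIR are assumptions --
# substitute the dfs.data.dir and log directory of your own cluster.
BLOCK=blk_-8354424441116992221
DATA_DIR=${DATA_DIR:-/hadoop/dfs/data}
LOG_DIR=${LOG_DIR:-/hadoop/logs}

# 1. Is the block (and its .meta file, which carries the generation
#    stamp) present on the DataNode's disk?
find "$DATA_DIR" -name "${BLOCK}*" 2>/dev/null

# 2. What do the DataNode logs say about this block?
grep "$BLOCK" "$LOG_DIR"/*datanode*.log* 2>/dev/null

# The generation stamp is the numeric suffix of the .meta file name,
# e.g. blk_-8354424441116992221_1060.meta -> generation stamp 1060.
META=blk_-8354424441116992221_1060.meta
STAMP=$(echo "$META" | sed 's/.*_\([0-9]*\)\.meta/\1/')
echo "generation stamp: $STAMP"
```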

Thanks & Regards,
Uma
----- Original Message -----
From: kiranprasad <kiranprasa...@imimobile.com>
Date: Friday, October 7, 2011 2:18 pm
Subject: ERROR 1066: Unable to open iterator for alias A. Backend error : Could 
not obtain block:
To: hdfs-user@hadoop.apache.org

> Hi  
> 
> 
> I've checked with the below-mentioned command and I am getting:
> 
> [kiranprasad.g@pig4 hadoop-0.20.2]$ bin/hadoop fs -text /data/arpumsisdn.txt | tail
> 11/10/07 16:17:18 INFO hdfs.DFSClient: No node available for block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:18 INFO hdfs.DFSClient: Could not obtain block blk_-8354424441116992221_1060 from any node: java.io.IOException: No live nodes contain current block
> 11/10/07 16:17:21 INFO hdfs.DFSClient: No node available for block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:21 INFO hdfs.DFSClient: Could not obtain block blk_-8354424441116992221_1060 from any node: java.io.IOException: No live nodes contain current block
> 11/10/07 16:17:25 INFO hdfs.DFSClient: No node available for block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 11/10/07 16:17:25 INFO hdfs.DFSClient: Could not obtain block blk_-8354424441116992221_1060 from any node: java.io.IOException: No live nodes contain current block
> 11/10/07 16:17:29 WARN hdfs.DFSClient: DFS Read: java.io.IOException: Could not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1812)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1638)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1767)
>         at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:1695)
>         at java.io.DataInputStream.readShort(DataInputStream.java:295)
>         at org.apache.hadoop.fs.FsShell.forMagic(FsShell.java:397)
>         at org.apache.hadoop.fs.FsShell.access$200(FsShell.java:49)
>         at org.apache.hadoop.fs.FsShell$2.process(FsShell.java:420)
>         at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1898)
>         at org.apache.hadoop.fs.FsShell.text(FsShell.java:414)
>         at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1563)
>         at org.apache.hadoop.fs.FsShell.run(FsShell.java:1763)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>         at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
>         at org.apache.hadoop.fs.FsShell.main(FsShell.java:1880)
> 
> text: Could not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> 
> 
> The block is not available. How can I recover the data block?
> 
> 
> -----Original Message----- 
> From: Alex Rovner
> Sent: Wednesday, October 05, 2011 5:55 PM
> To: u...@pig.apache.org
> Subject: Re: ERROR 1066: Unable to open iterator for alias A. Backend error : Could not obtain block:
> 
> You can also quickly test whether that's the issue by running the following
> command:
> 
> hadoop fs -text /data/arpumsisdn.txt | tail
> 
> On Wed, Oct 5, 2011 at 8:24 AM, Alex Rovner <alexrov...@gmail.com> 
> wrote:
> Kiran,
> 
> This looks like your HDFS is missing some blocks. Can you run fsck and see
> if you have missing blocks, and if so, for which files?
> 
> http://hadoop.apache.org/common/docs/r0.17.2/hdfs_user_guide.html#Fsck
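For reference, a typical fsck invocation on a 0.20.x cluster might look like the sketch below. The HADOOP_HOME default is an assumption based on the install directory seen elsewhere in this thread; the guard just keeps the snippet harmless when no cluster is present.

```shell
#!/bin/sh
# Sketch for a 0.20.x cluster. HADOOP_HOME default is an assumption --
# point it at your actual Hadoop install directory.
HADOOP_HOME=${HADOOP_HOME:-/home/kiranprasad.g/hadoop-0.20.2}

if [ -x "$HADOOP_HOME/bin/hadoop" ]; then
    # Per-file detail: block IDs, replica locations, missing replicas.
    "$HADOOP_HOME/bin/hadoop" fsck /data/arpumsisdn.txt -files -blocks -locations
    # Whole-namespace health summary (look for CORRUPT / missing blocks).
    "$HADOOP_HOME/bin/hadoop" fsck /
else
    echo "hadoop binary not found at $HADOOP_HOME/bin/hadoop"
fi
```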
> 
> Alex
> 
> 
> On Tue, Oct 4, 2011 at 7:53 AM, kiranprasad <kiranprasa...@imimobile.com> wrote:
> 
> I am getting the below exception when trying to execute a Pig Latin script.
> Failed!
> 
> Failed Jobs:
> JobId   Alias   Feature Message Outputs
> job_201110042009_0005   A       MAP_ONLY        Message: Job failed!
> hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019,
> 
> Input(s):
> Failed to read data from "/data/arpumsisdn.txt"
> 
> Output(s):
> Failed to produce result in "hdfs://10.0.0.61/tmp/temp1751671187/tmp-592386019"
> 
> Counters:
> Total records written : 0
> Total bytes written : 0
> Spillable Memory Manager spill count : 0
> Total bags proactively spilled: 0
> Total records proactively spilled: 0
> 
> Job DAG:
> job_201110042009_0005
> 
> 
> 2011-10-04 22:13:53,736 [main] INFO org.apache.pig.backend.hadoop.executionengine.mapReduceLayer.MapReduceLauncher - Failed!
> 2011-10-04 22:13:53,745 [main] ERROR org.apache.pig.tools.grunt.Grunt - ERROR 1066: Unable to open iterator for alias A. Backend error : Could not obtain block: blk_-8354424441116992221_1060 file=/data/arpumsisdn.txt
> Details at logfile: /home/kiranprasad.g/pig-0.8.1/pig_1317746514798.log
> 
> 
> Regards
> Kiran.G
