Hi,

We were hitting file descriptor limits :). Increased the limit and that solved it.
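For anyone hitting the same thing: a quick way to check and (best-effort) raise the per-process open-file limit is sketched below. This is a generic illustration, not the exact commands we ran:

```shell
# Show the current soft and hard limits on open file descriptors
soft=$(ulimit -Sn)
hard=$(ulimit -Hn)
echo "soft=$soft hard=$hard"

# Best effort: raise the soft limit up to the hard limit for this shell only
ulimit -Sn "$hard" 2>/dev/null || true
echo "new soft=$(ulimit -Sn)"
```

To make the change persistent you would typically add nofile entries for the Hadoop user in /etc/security/limits.conf and restart the daemons.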

Thanks Jason

-Sagar


Sagar Naik wrote:
Hi,
We have a strange problem on getting out some of our files

bin/hadoop dfs -text dir/*  gives me missing block exceptions.
08/11/04 10:45:09 [main] INFO dfs.DFSClient: Could not obtain block blk_6488385702283300787_1247408 from any node: java.io.IOException: No live nodes contain current block
08/11/04 10:45:12 [main] INFO dfs.DFSClient: Could not obtain block blk_6488385702283300787_1247408 from any node: java.io.IOException: No live nodes contain current block
08/11/04 10:45:15 [main] INFO dfs.DFSClient: Could not obtain block blk_6488385702283300787_1247408 from any node: java.io.IOException: No live nodes contain current block
08/11/04 10:45:18 [main] WARN dfs.DFSClient: DFS Read: java.io.IOException: Could not obtain block: blk_6488385702283300787_1247408 file=some_filepath-1
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.chooseDataNode(DFSClient.java:1462)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:1312)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1417)
        at org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(DFSClient.java:1369)
        at java.io.DataInputStream.readShort(DataInputStream.java:295)
        at org.apache.hadoop.fs.FsShell.forMagic(FsShell.java:396)
        at org.apache.hadoop.fs.FsShell.access$1(FsShell.java:394)
        at org.apache.hadoop.fs.FsShell$2.process(FsShell.java:419)
        at org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(FsShell.java:1865)
        at org.apache.hadoop.fs.FsShell.text(FsShell.java:421)
        at org.apache.hadoop.fs.FsShell.doall(FsShell.java:1532)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:1730)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:1847)

but when I do a
bin/hadoop dfs -text some_filepath-1, I do get all the data.
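Since the glob form opens many files at once while the single-file form opens only one, a quick Linux-side check is to count a process's open descriptors under /proc. A hypothetical diagnostic (the pid here is the current shell; in practice you would use the FsShell JVM's pid):

```shell
# Count open file descriptors for a process (here: this shell's own pid)
pid=$$
fd_count=$(ls "/proc/$pid/fd" 2>/dev/null | wc -l)
echo "pid=$pid open fds=$fd_count"
```

Comparing that count against `ulimit -n` shows how close the process is to the limit.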


fsck on the parent of this file revealed no problems.



jstack on FsShell revealed nothing much:

Debugger attached successfully.
Server compiler detected.
JVM version is 10.0-b19
Deadlock Detection:

No deadlocks found.

Thread 3358: (state = BLOCKED)
- java.lang.Thread.sleep(long) @bci=0 (Interpreted frame)
- org.apache.hadoop.dfs.DFSClient$LeaseChecker.run() @bci=124, line=792 (Interpreted frame)
- java.lang.Thread.run() @bci=11, line=619 (Interpreted frame)


Thread 3357: (state = BLOCKED)
- java.lang.Object.wait(long) @bci=0 (Interpreted frame)
- org.apache.hadoop.ipc.Client$Connection.waitForWork() @bci=62, line=397 (Interpreted frame)
- org.apache.hadoop.ipc.Client$Connection.run() @bci=63, line=440 (Interpreted frame)


Thread 3342: (state = BLOCKED)


Thread 3341: (state = BLOCKED)
- java.lang.Object.wait(long) @bci=0 (Interpreted frame)
- java.lang.ref.ReferenceQueue.remove(long) @bci=44, line=116 (Interpreted frame)
- java.lang.ref.ReferenceQueue.remove() @bci=2, line=132 (Interpreted frame)
- java.lang.ref.Finalizer$FinalizerThread.run() @bci=3, line=159 (Interpreted frame)


Thread 3340: (state = BLOCKED)
- java.lang.Object.wait(long) @bci=0 (Interpreted frame)
- java.lang.Object.wait() @bci=2, line=485 (Interpreted frame)
- java.lang.ref.Reference$ReferenceHandler.run() @bci=46, line=116 (Interpreted frame)


Thread 3330: (state = BLOCKED)
- java.lang.Thread.sleep(long) @bci=0 (Interpreted frame)
- org.apache.hadoop.dfs.DFSClient$DFSInputStream.chooseDataNode(org.apache.hadoop.dfs.LocatedBlock) @bci=181, line=1470 (Interpreted frame)
- org.apache.hadoop.dfs.DFSClient$DFSInputStream.blockSeekTo(long) @bci=133, line=1312 (Interpreted frame)
- org.apache.hadoop.dfs.DFSClient$DFSInputStream.read(byte[], int, int) @bci=61, line=1417 (Interpreted frame)
- org.apache.hadoop.dfs.DFSClient$DFSInputStream.read() @bci=7, line=1369 (Compiled frame)
- java.io.DataInputStream.readShort() @bci=4, line=295 (Compiled frame)
- org.apache.hadoop.fs.FsShell.forMagic(org.apache.hadoop.fs.Path, org.apache.hadoop.fs.FileSystem) @bci=7, line=396 (Interpreted frame)
- org.apache.hadoop.fs.FsShell.access$1(org.apache.hadoop.fs.FsShell, org.apache.hadoop.fs.Path, org.apache.hadoop.fs.FileSystem) @bci=3, line=394 (Interpreted frame)
- org.apache.hadoop.fs.FsShell$2.process(org.apache.hadoop.fs.Path, org.apache.hadoop.fs.FileSystem) @bci=28, line=419 (Interpreted frame)
- org.apache.hadoop.fs.FsShell$DelayedExceptionThrowing.globAndProcess(org.apache.hadoop.fs.Path, org.apache.hadoop.fs.FileSystem) @bci=40, line=1865 (Interpreted frame)
- org.apache.hadoop.fs.FsShell.text(java.lang.String) @bci=26, line=421 (Interpreted frame)
- org.apache.hadoop.fs.FsShell.doall(java.lang.String, java.lang.String[], int) @bci=246, line=1532 (Interpreted frame)
- org.apache.hadoop.fs.FsShell.run(java.lang.String[]) @bci=586, line=1730 (Interpreted frame)
- org.apache.hadoop.util.ToolRunner.run(org.apache.hadoop.conf.Configuration, org.apache.hadoop.util.Tool, java.lang.String[]) @bci=38, line=65 (Interpreted frame)
- org.apache.hadoop.util.ToolRunner.run(org.apache.hadoop.util.Tool, java.lang.String[]) @bci=8, line=79 (Interpreted frame)
- org.apache.hadoop.fs.FsShell.main(java.lang.String[]) @bci=10, line=1847 (Interpreted frame)



In the UI,
I checked for the file. It did list the file and was able to print the blocks.


