Hi Folks,

I was following the instructions at
http://wiki.apache.org/hadoop/Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)

The problem is that when I run the command 'bin/hadoop dfs -copyFromLocal'
from the shell, I get an IOException, shown below. Could someone tell
me what the issue might be?
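
For reference, this is roughly the sequence I ran and checked; the file
names and paths are my assumption based on the wiki tutorial, since I
followed it as written:

    # start the single-node cluster and confirm the daemons are up
    bin/start-all.sh
    jps    # should list NameNode, DataNode, SecondaryNameNode, JobTracker, TaskTracker

    # copy a local file into HDFS -- this is the step that fails
    bin/hadoop dfs -copyFromLocal README.txt README.txt

    # ask the namenode how many datanodes it can see
    bin/hadoop dfsadmin -report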

==== START EXCEPTION ====
08/08/11 14:48:05 INFO dfs.DFSClient: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/jayant/README.txt could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1145)
        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:300)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:446)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:896)

        at org.apache.hadoop.ipc.Client.call(Client.java:557)
        at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:212)
        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:82)
        at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:59)
        at org.apache.hadoop.dfs.$Proxy0.addBlock(Unknown Source)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.locateFollowingBlock(DFSClient.java:2334)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.nextBlockOutputStream(DFSClient.java:2219)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream.access$1700(DFSClient.java:1702)
        at org.apache.hadoop.dfs.DFSClient$DFSOutputStream$DataStreamer.run(DFSClient.java:1842)

08/08/11 14:48:05 WARN dfs.DFSClient: NotReplicatedYetException sleeping /user/jayant/README.txt retries left 1
08/08/11 14:48:08 WARN dfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/jayant/README.txt could only be replicated to 0 nodes, instead of 1
        at org.apache.hadoop.dfs.FSNamesystem.getAdditionalBlock(FSNamesystem.java:1145)
        at org.apache.hadoop.dfs.NameNode.addBlock(NameNode.java:300)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
        at java.lang.reflect.Method.invoke(Method.java:597)
        at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:446)
        at org.apache.hadoop.ipc.Server$Handler.run(Server.java:896)

08/08/11 14:48:08 WARN dfs.DFSClient: Error Recovery for block null bad datanode[0]
==== END EXCEPTION ====


Thanks and Best Regards,
Jayant

-- 
www.jkg.in | http://www.jkg.in/contact-me/
Jayant Kr. Gandhi
