Hi all,

I just started using Hadoop a few days ago. While trying to copy data files to DFS after starting Hadoop, I hit this error:

  WARN hdfs.DFSClient: DataStreamer Exception: org.apache.hadoop.ipc.RemoteException: java.io.IOException: File /user/hadoop/count/count/temp1 could only be replicated to 0 nodes, instead of 1
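The copy itself is just the standard command from the tutorial, roughly like this (the local path here is only an example; mine is different):

  bin/hadoop dfs -copyFromLocal /tmp/count /user/hadoop/count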
I did all the settings according to the instructions in "Running_Hadoop_On_Ubuntu_Linux_(Single-Node_Cluster)", and I don't know what's wrong. Besides, during the process no error messages are written to the log files. Also, according to http://localhost.localdomain:50070/dfshealth.jsp, I have one live namenode. In the browser I can even see that the first data file is created in DFS, but its size is 0.

Things I've tried:
1. Stopped Hadoop, re-formatted DFS, and started Hadoop again (the exact commands are in the P.S. below).
2. Changed "localhost" to "127.0.0.1".

Neither of them works. Could anyone help me or give me a hint?

Thanks,
Anthony
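P.S. In case it is useful, the commands I used for step 1 were roughly the following, run from the Hadoop installation directory as in the tutorial (the installation path may of course differ on other setups):

  bin/stop-all.sh               # stop all Hadoop daemons
  bin/hadoop namenode -format   # re-format DFS
  bin/start-all.sh              # start everything again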