[ https://issues.apache.org/jira/browse/HDFS-3179?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13245119#comment-13245119 ]
amith commented on HDFS-3179:
-----------------------------

Hi Zhanwei Wang

I don't know exactly what your test script does, but this looks similar to HDFS-3091. Can you check this once: https://issues.apache.org/jira/browse/HDFS-3091

Please correct me if I am wrong :)

> failed to append data, DataStreamer throws an exception, "nodes.length != original.length + 1" on single datanode cluster
> ------------------------------------------------------------------------------------------------------------------------
>
>                 Key: HDFS-3179
>                 URL: https://issues.apache.org/jira/browse/HDFS-3179
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: data-node
>    Affects Versions: 0.23.2
>            Reporter: Zhanwei.Wang
>            Priority: Critical
>
> Steps to reproduce:
> # create a single datanode cluster
> # disable permissions
> # enable webhdfs
> # start hdfs
> # run the test script
>
> Expected result:
> a file named "test" is created and its content is "testtest"
>
> The result I got:
> hdfs throws an exception on the second append operation.
> {code}
> ./test.sh
> {"RemoteException":{"exception":"IOException","javaClassName":"java.io.IOException","message":"Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]"}}
> {code}
> Log in datanode:
> {code}
> 2012-04-02 14:34:21,058 WARN org.apache.hadoop.hdfs.DFSClient: DataStreamer Exception
> java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:778)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:834)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:930)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:461)
> 2012-04-02 14:34:21,059 ERROR org.apache.hadoop.hdfs.DFSClient: Failed to close file /test
> java.io.IOException: Failed to add a datanode: nodes.length != original.length + 1, nodes=[127.0.0.1:50010], original=[127.0.0.1:50010]
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.findNewDatanode(DFSOutputStream.java:778)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.addDatanode2ExistingPipeline(DFSOutputStream.java:834)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.setupPipelineForAppendOrRecovery(DFSOutputStream.java:930)
>         at org.apache.hadoop.hdfs.DFSOutputStream$DataStreamer.run(DFSOutputStream.java:461)
> {code}
> test.sh
> {code}
> #!/bin/sh
> echo "test" > test.txt
> curl -L -X PUT "http://localhost:50070/webhdfs/v1/test?op=CREATE"
> curl -L -X POST -T test.txt "http://localhost:50070/webhdfs/v1/test?op=APPEND"
> curl -L -X POST -T test.txt "http://localhost:50070/webhdfs/v1/test?op=APPEND"
> {code}

--
This message is automatically generated by JIRA.
If you think it was sent incorrectly, please contact your JIRA administrators: https://issues.apache.org/jira/secure/ContactAdministrators!default.jspa
For more information on JIRA, see: http://www.atlassian.com/software/jira
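Not part of the original report, but a note for anyone reproducing this: the "Failed to add a datanode" error comes from the client-side pipeline-recovery policy, which on an append tries to add a replacement datanode and fails when the cluster has no other node to offer. On single-datanode test clusters, the commonly used workaround is to relax that policy in hdfs-site.xml. A sketch, using the standard keys from hdfs-default.xml (NEVER is only reasonable on small or test clusters where no replacement node exists anyway):

{code}
<!-- hdfs-site.xml (read by the DFS client) -->
<!-- Sketch of a workaround for clusters with fewer than 3 datanodes. -->
<property>
  <name>dfs.client.block.write.replace-datanode-on-failure.enable</name>
  <value>true</value>
</property>
<property>
  <!-- NEVER: do not attempt to replace a failed datanode in the write
       pipeline. Avoids "nodes.length != original.length + 1", at the cost
       of reduced durability on real clusters. -->
  <name>dfs.client.block.write.replace-datanode-on-failure.policy</name>
  <value>NEVER</value>
</property>
{code}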