[ https://issues.apache.org/jira/browse/HDFS-3788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13437301#comment-13437301 ]
Hudson commented on HDFS-3788:
------------------------------

Integrated in Hadoop-Hdfs-0.23-Build #347 (See [https://builds.apache.org/job/Hadoop-Hdfs-0.23-Build/347/])
svn merge -c 1374122. FIXES: HDFS-3788. ByteRangeInputStream should not expect HTTP Content-Length header when chunked transfer-encoding is used. (Revision 1374406)

Result = SUCCESS
daryn : http://svn.apache.org/viewcvs.cgi/?root=Apache-SVN&view=rev&rev=1374406
Files :
* /hadoop/common/branches/branch-0.23/hadoop-hdfs-project/hadoop-hdfs/CHANGES.txt
* /hadoop/common/branches/branch-0.23/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/ByteRangeInputStream.java

> distcp can't copy large files using webhdfs due to missing Content-Length header
> --------------------------------------------------------------------------------
>
>                 Key: HDFS-3788
>                 URL: https://issues.apache.org/jira/browse/HDFS-3788
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: webhdfs
>    Affects Versions: 0.23.3, 2.0.0-alpha
>            Reporter: Eli Collins
>            Assignee: Tsz Wo (Nicholas), SZE
>            Priority: Critical
>             Fix For: 0.23.3, 2.2.0-alpha
>
>         Attachments: 20120814NullEntity.patch, distcp-webhdfs-errors.txt, h3788_20120813.patch, h3788_20120814.patch, h3788_20120814b.patch, h3788_20120815.patch, h3788_20120816.patch
>
>
> The following command fails when data1 contains a 3gb file. It passes when using hftp or when the directory contains only smaller (<2gb) files, so it looks like a webhdfs issue with large files.
> {{hadoop distcp webhdfs://eli-thinkpad:50070/user/eli/data1 hdfs://localhost:8020/user/eli/data2}}
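The failure mode behind the fix: per HTTP/1.1, a response sent with {{Transfer-Encoding: chunked}} carries no {{Content-Length}} header, so a range-reading stream that unconditionally requires that header breaks on such responses. Below is a minimal, hypothetical sketch (not the actual ByteRangeInputStream patch; class and method names are invented for illustration) of the header-handling logic the summary describes: accept a declared length when present, treat the length as unknown when the transfer is chunked, and fail only when neither applies.

```java
import java.util.Map;

// Hypothetical sketch, not the HDFS-3788 patch: derive the stream length
// from HTTP response headers, tolerating a missing Content-Length when
// the response uses chunked transfer-encoding.
public class RangeLengthSketch {

    /**
     * Returns the declared content length, or -1 when chunked
     * transfer-encoding makes the length unknown up front.
     */
    public static long declaredLength(Map<String, String> headers) {
        String contentLength = headers.get("Content-Length");
        if (contentLength != null) {
            // Parse as long: a ~3GB file overflows int, so int-based
            // handling would also misbehave for large files.
            return Long.parseLong(contentLength);
        }
        String transferEncoding = headers.get("Transfer-Encoding");
        if (transferEncoding != null
                && transferEncoding.equalsIgnoreCase("chunked")) {
            return -1L; // length unknown; caller reads until EOF
        }
        throw new IllegalStateException(
            "Missing Content-Length and transfer-encoding is not chunked");
    }

    public static void main(String[] args) {
        // Plain response with an explicit 3GB length.
        System.out.println(
            declaredLength(Map.of("Content-Length", "3221225472")));
        // Chunked response: no Content-Length, length reported as unknown.
        System.out.println(
            declaredLength(Map.of("Transfer-Encoding", "chunked")));
    }
}
```

With this shape, only the no-length, non-chunked case is an error; a chunked response simply reports an unknown length and the caller reads to end-of-stream.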