[ 
https://issues.apache.org/jira/browse/HDFS-9220?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Bogdan Raducanu updated HDFS-9220:
----------------------------------
    Description: 
Exception:
2015-10-09 14:59:40 WARN  DFSClient:1150 - fetchBlockByteRange(). Got a 
checksum exception for /tmp/file0.05355529331575182 at 
BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882:0 from 
DatanodeInfoWithStorage[10.10.10.10]:5001

All 3 replicas cause this exception and the read fails entirely with:
BlockMissingException: Could not obtain block: 
BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882 
file=/tmp/file0.05355529331575182

Code to reproduce is attached (test2.java).
The problem does not occur in 2.7.0.
The data is read correctly if checksum verification is disabled.
More generally, the failure happens when reading from the last block of a file 
when that block holds <= 512 bytes, i.e. at most one checksum chunk at the 
default bytes-per-checksum.
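A minimal sketch of such a reproducer, based only on the description above (the 
attached test2.java is the authoritative version). The path name and byte counts 
are illustrative, and it assumes an HDFS instance reachable through the default 
Configuration:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataInputStream;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class SmallAppendReadRepro {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    Path p = new Path("/tmp/small-append-repro");   // illustrative path

    // Write fewer than 512 bytes, i.e. less than one checksum chunk.
    try (FSDataOutputStream out = fs.create(p, true)) {
      out.write(new byte[100]);
    }

    // Re-open the file for append and keep the stream open while reading it.
    FSDataOutputStream appendStream = fs.append(p);
    appendStream.write(new byte[1]);
    appendStream.hflush();

    // Positional read goes through fetchBlockByteRange(); per the report,
    // on 2.7.1 this is where the checksum exception (and eventually
    // BlockMissingException) shows up.
    byte[] buf = new byte[100];
    try (FSDataInputStream in = fs.open(p)) {
      in.readFully(0, buf);
    }

    // Workaround mentioned above: with client-side checksum verification
    // disabled, the same read returns the correct data.
    fs.setVerifyChecksum(false);
    try (FSDataInputStream in = fs.open(p)) {
      in.readFully(0, buf);
    }

    appendStream.close();
  }
}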

  was:
Exception:
2015-10-09 14:59:40 WARN  DFSClient:1150 - fetchBlockByteRange(). Got a 
checksum exception for /tmp/file0.05355529331575182 at 
BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882:0 from 
DatanodeInfoWithStorage[10.10.10.10]:5001

All 3 replicas cause this exception and the read fails entirely with:
BlockMissingException: Could not obtain block: 
BP-353681639-10.10.10.10-1437493596883:blk_1075692769_9244882 
file=/tmp/file0.05355529331575182

Code to reproduce is attached.
Does not happen in 2.7.0.
Data is read correctly if checksum verification is disabled.


> Reading small file (< 512 bytes) that is open for append fails due to 
> incorrect checksum
> ----------------------------------------------------------------------------------------
>
>                 Key: HDFS-9220
>                 URL: https://issues.apache.org/jira/browse/HDFS-9220
>             Project: Hadoop HDFS
>          Issue Type: Bug
>    Affects Versions: 2.7.1
>            Reporter: Bogdan Raducanu
>            Assignee: Jing Zhao
>            Priority: Blocker
>         Attachments: HDFS-9220.000.patch, HDFS-9220.001.patch, test2.java
>



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
