Hey Yang,

Looks like HDFS is having trouble with a block. Have you tried running
hadoop fsck?
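
For example, pointing fsck at the WAL directory should show whether the block
is actually corrupt or just still under construction (the path below is only a
guess, adjust it to your HBase root dir):

  hadoop fsck /hbase/.logs -files -blocks -locations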

-Dima

On Thursday, August 11, 2016, Ming Yang <yangming860...@gmail.com> wrote:

> The cluster enabled shortCircuitLocalReads.
> <property>
>     <name>dfs.client.read.shortcircuit</name>
>     <value>true</value>
> </property>
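
(Inline note: with the old path-based short-circuit implementation that your
DFSClient/BlockReaderLocal stack trace below points at, the datanode usually
also has to whitelist the reading user. Roughly like this, assuming the
regionserver runs as "hbase":

  <property>
      <name>dfs.block.local-path-access.user</name>
      <value>hbase</value>
  </property>
)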
>
> With replication enabled, we see a large number of errors in the logs:
> 1. The short-circuit local read fails every time.
> 2. The fallback read via the datanode on targetAddr succeeds.
> How can we make short-circuit local reads succeed while replication is enabled?
>
> 2016-08-03 10:46:21,721 DEBUG
> org.apache.hadoop.hbase.replication.regionserver.ReplicationSource:
> Opening
> log for replication dn7%2C60020%2C1470136216957.1470192327030 at 16999670
> 2016-08-03 10:46:21,723 WARN org.apache.hadoop.hdfs.DFSClient:
> BlockReaderLocal requested with incorrect offset: Offset 0 and length
> 17073479 don't match block blk_4137524355009640437_53760530 ( blockLen
> 16999670 )
> 2016-08-03 10:46:21,723 WARN org.apache.hadoop.hdfs.DFSClient:
> BlockReaderLocal: Removing blk_4137524355009640437_53760530 from cache
> because local file
> /sdd/hdfs/dfs/data/blocksBeingWritten/blk_4137524355009640437 could not be
> opened.
> 2016-08-03 10:46:21,724 INFO org.apache.hadoop.hdfs.DFSClient: Failed to
> read block blk_4137524355009640437_53760530 on local machine
> java.io.IOException: Offset 0 and length 17073479 don't match block
> blk_4137524355009640437_53760530 ( blockLen 16999670 )
> at org.apache.hadoop.hdfs.BlockReaderLocal.<init>(BlockReaderLocal.java:287)
> at org.apache.hadoop.hdfs.BlockReaderLocal.newBlockReader(BlockReaderLocal.java:171)
> at org.apache.hadoop.hdfs.DFSClient.getLocalBlockReader(DFSClient.java:358)
> at org.apache.hadoop.hdfs.DFSClient.access$800(DFSClient.java:74)
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.blockSeekTo(DFSClient.java:2073)
> at org.apache.hadoop.hdfs.DFSClient$DFSInputStream.read(DFSClient.java:2224)
> at java.io.DataInputStream.read(DataInputStream.java:149)
> at java.io.DataInputStream.readFully(DataInputStream.java:195)
> at java.io.DataInputStream.readFully(DataInputStream.java:169)
> at org.apache.hadoop.io.SequenceFile$Reader.init(SequenceFile.java:1508)
> at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1486)
> at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1475)
> at org.apache.hadoop.io.SequenceFile$Reader.<init>(SequenceFile.java:1470)
> at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader$WALReader.<init>(SequenceFileLogReader.java:55)
> at org.apache.hadoop.hbase.regionserver.wal.SequenceFileLogReader.init(SequenceFileLogReader.java:178)
> at org.apache.hadoop.hbase.regionserver.wal.HLog.getReader(HLog.java:734)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationHLogReaderManager.openReader(ReplicationHLogReaderManager.java:69)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.openReader(ReplicationSource.java:574)
> at org.apache.hadoop.hbase.replication.regionserver.ReplicationSource.run(ReplicationSource.java:364)
> 2016-08-03 10:46:21,724 INFO org.apache.hadoop.hdfs.DFSClient: Try reading
> via the datanode on /192.168.7.139:50010
>

