Hi,

I think there is a block synchronization issue in your HDFS cluster.
Frankly, I have not faced this issue myself yet.

I believe you need to refresh your NameNode fsimage so that it is up to
date with your DataNodes.
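
A minimal sketch of both steps (assuming a Hadoop 2.x cluster and HDFS admin privileges; the block ID is the one from the DataNode log in this thread):

```shell
# Check whether the NameNode still tracks the block mentioned in the log
hdfs fsck / -files -blocks | grep 'blk_7796221171187533460'

# Force a fresh fsimage checkpoint: enter safe mode, save the current
# namespace to disk, then leave safe mode so writes can resume
hdfs dfsadmin -safemode enter
hdfs dfsadmin -saveNamespace
hdfs dfsadmin -safemode leave
```

Note that `-saveNamespace` only rewrites the NameNode's on-disk image; it will not recreate replicas that the DataNodes have actually lost.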

Thanks.
On Wed, Jul 31, 2013 at 6:16 AM, ch huang <justlo...@gmail.com> wrote:

> Thanks for the reply. The block does not exist, but why did it go missing?
>
>
> On Wed, Jul 31, 2013 at 2:02 AM, Jitendra Yadav <
> jeetuyadav200...@gmail.com> wrote:
>
>> Hi,
>>
>> Can you please check the existence/status of any of the mentioned blocks
>> in your HDFS cluster?
>>
>> Command:
>> hdfs fsck / -files -blocks | grep 'blk number'
>>
>> Thanks
>>
>> On 7/30/13, ch huang <justlo...@gmail.com> wrote:
>> > I do not know how to solve this; can anyone help?
>> >
>> > 2013-07-30 17:28:40,953 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861
>> > 2013-07-30 17:28:40,953 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.2.209:4421 dest: /192.168.10.34:50011
>> > org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7796221171187533460_458861
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.getReplicaInfo(FsDatasetImpl.java:353)
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:489)
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:92)
>> >         at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:168)
>> >         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:451)
>> >         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:103)
>> >         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:67)
>> >         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
>> >         at java.lang.Thread.run(Thread.java:662)
>> > 2013-07-30 17:28:40,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863 src: /192.168.2.209:4423 dest: /192.168.10.34:50011
>> > 2013-07-30 17:28:40,978 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863
>> > 2013-07-30 17:28:40,978 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.2.209:4423 dest: /192.168.10.34:50011
>> > org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_-2057894024775992993_458863
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.getReplicaInfo(FsDatasetImpl.java:353)
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:489)
>> >         at org.apache.hadoop.hdfs.server.datanode.fsdataset.impl.FsDatasetImpl.append(FsDatasetImpl.java:92)
>> >         at org.apache.hadoop.hdfs.server.datanode.BlockReceiver.<init>(BlockReceiver.java:168)
>> >         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.writeBlock(DataXceiver.java:451)
>> >         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opWriteBlock(Receiver.java:103)
>> >         at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:67)
>> >         at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
>> >         at java.lang.Thread.run(Thread.java:662)
>> > 2013-07-30 17:28:41,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: Receiving BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865 src: /192.168.2.209:4426 dest: /192.168.10.34:50011
>> > 2013-07-30 17:28:41,002 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opWriteBlock BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865 received exception org.apache.hadoop.hdfs.server.datanode.ReplicaNotFoundException: Cannot append to a non-existent replica BP-1099828917-192.168.10.22-1373361366827:blk_7728515140810267551_458865
>> > 2013-07-30 17:28:41,002 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: CH34:50011:DataXceiver error processing WRITE_BLOCK operation  src: /192.168.2.209:4426 dest: /192.168.10.34:50011
>> >
>>
>
>
