issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread ch huang
hi, maillist:
  I see the following info in my HDFS log. The block belongs to a file
written (appended to) by Scribe, and I don't know why this happens.
Is there some limit in HDFS that I am hitting?

2014-02-21 10:33:30,235 INFO org.apache.hadoop.hdfs.server.datanode.DataNode: opReadBlock BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240 received exception java.io.IOException: Replica gen stamp < block genstamp, block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240, replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
  getNumBytes()     = 35840
  getBytesOnDisk()  = 35840
  getVisibleLength()= -1
  getVolume()       = /data/4/dn/current
  getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
  unlinked=false
2014-02-21 10:33:30,235 WARN org.apache.hadoop.hdfs.server.datanode.DataNode: DatanodeRegistration(192.168.11.12, storageID=DS-754202132-192.168.11.12-50010-1382443087835, infoPort=50075, ipcPort=50020, storageInfo=lv=-40;cid=CID-0e777b8c-19f3-44a1-8af1-916877f2506c;nsid=2086828354;c=0):Got exception while serving BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240 to /192.168.11.15:56564
java.io.IOException: Replica gen stamp < block genstamp, block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240, replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
  getNumBytes()     = 35840
  getBytesOnDisk()  = 35840
  getVisibleLength()= -1
  getVolume()       = /data/4/dn/current
  getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
  unlinked=false
        at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:205)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:326)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:92)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:64)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
        at java.lang.Thread.run(Thread.java:744)
2014-02-21 10:33:30,236 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation  src: /192.168.11.15:56564 dest: /192.168.11.12:50010
java.io.IOException: Replica gen stamp < block genstamp, block=BP-1043055049-192.168.11.11-1382442676609:blk_-8536558734938003208_3823240, replica=ReplicaWaitingToBeRecovered, blk_-8536558734938003208_3820986, RWR
  getNumBytes()     = 35840
  getBytesOnDisk()  = 35840
  getVisibleLength()= -1
  getVolume()       = /data/4/dn/current
  getBlockFile()    = /data/4/dn/current/BP-1043055049-192.168.11.11-1382442676609/current/rbw/blk_-8536558734938003208
  unlinked=false
        at org.apache.hadoop.hdfs.server.datanode.BlockSender.<init>(BlockSender.java:205)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.readBlock(DataXceiver.java:326)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.opReadBlock(Receiver.java:92)
        at org.apache.hadoop.hdfs.protocol.datatransfer.Receiver.processOp(Receiver.java:64)
        at org.apache.hadoop.hdfs.server.datanode.DataXceiver.run(DataXceiver.java:221)
        at java.lang.Thread.run(Thread.java:744)


Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread Ted Yu
Which Hadoop release are you using?

Cheers


On Thu, Feb 20, 2014 at 8:57 PM, ch huang justlo...@gmail.com wrote:

 hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?

 [quoted log snipped; identical to the log in the original message]



Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread Anurag Tangri
Did you check your Unix open-file limit and the DataNode xceiver value?

Are they too low for the number of blocks/amount of data in your cluster?
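
A quick way to check both (a sketch: the hdfs user name and the
/etc/hadoop/conf path assume a standard CDH-style install, and
dfs.datanode.max.transfer.threads is the current name of the old,
misspelled dfs.datanode.max.xcievers property):

  # On each DataNode host: open-file limit of the user running the DN
  su -s /bin/bash -c 'ulimit -n' hdfs

  # Xceiver setting, if overridden in hdfs-site.xml (default: 4096)
  grep -A1 'dfs.datanode.max' /etc/hadoop/conf/hdfs-site.xml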

Thanks,
Anurag Tangri

 On Feb 20, 2014, at 6:57 PM, ch huang justlo...@gmail.com wrote:
 
 hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?
  
 [quoted log snipped; identical to the log in the original message]


Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread ch huang
Hi, I use CDH 4.4.

On Fri, Feb 21, 2014 at 12:04 PM, Ted Yu yuzhih...@gmail.com wrote:

 Which Hadoop release are you using?

 Cheers


 On Thu, Feb 20, 2014 at 8:57 PM, ch huang justlo...@gmail.com wrote:

  hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?

 [quoted log snipped; identical to the log in the original message]





Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread ch huang
I use the default value; it seems to be 4096.

I also checked the hdfs user's limits, and they are large enough:

-bash-4.1$ ulimit -a
core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 514914
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 32768
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 10240
cpu time               (seconds, -t) unlimited
max user processes              (-u) 65536
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited


On Fri, Feb 21, 2014 at 12:25 PM, Anurag Tangri anurag_tan...@yahoo.com wrote:

 Did you check your Unix open-file limit and the DataNode xceiver value?

 Are they too low for the number of blocks/amount of data in your cluster?

 Thanks,
 Anurag Tangri

 On Feb 20, 2014, at 6:57 PM, ch huang justlo...@gmail.com wrote:

  hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?

 [quoted log snipped; identical to the log in the original message]




Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread ch huang
One more question: if I need to raise the DataNode xceiver value, do I
need to add it to my NN config file?
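
For context, this is a DataNode-side setting, read from hdfs-site.xml on
each DataNode; the NameNode does not use it. A minimal sketch, using the
old CDH4-era property name from this thread and an illustrative value:

  <!-- hdfs-site.xml on every DataNode; takes effect after a DN restart -->
  <property>
    <!-- deprecated alias of dfs.datanode.max.transfer.threads -->
    <name>dfs.datanode.max.xcievers</name>
    <value>8192</value>
  </property>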



On Fri, Feb 21, 2014 at 12:25 PM, Anurag Tangri anurag_tan...@yahoo.com wrote:

 Did you check your Unix open-file limit and the DataNode xceiver value?

 Are they too low for the number of blocks/amount of data in your cluster?

 Thanks,
 Anurag Tangri

 On Feb 20, 2014, at 6:57 PM, ch huang justlo...@gmail.com wrote:

  hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?

 [quoted log snipped; identical to the log in the original message]




Re: issue about write append into hdfs ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: ch12:50010:DataXceiver error processing READ_BLOCK operation

2014-02-20 Thread ch huang
I changed the config on all DataNodes, adding dfs.datanode.max.xcievers
with a value of 131072, and restarted all DNs; it still doesn't help.
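
Worth noting: the exception above is not a resource-limit error. "Replica
gen stamp < block genstamp" with a replica in ReplicaWaitingToBeRecovered
(RWR) state means this DataNode holds a stale copy left over from an
interrupted append/write pipeline that has not yet gone through block
recovery, so raising the xceiver count would not be expected to help.
A diagnostic sketch (the path below is a placeholder for the Scribe
output file):

  # Check whether the file is still open for write and where its replicas live
  hdfs fsck /path/to/scribe/file -openforwrite -files -blocks -locations

Reads served from an up-to-date replica succeed; the stale RWR copy on
192.168.11.12 should be fixed up or discarded once recovery of the block
completes.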

On Fri, Feb 21, 2014 at 12:25 PM, Anurag Tangri anurag_tan...@yahoo.com wrote:

 Did you check your Unix open-file limit and the DataNode xceiver value?

 Are they too low for the number of blocks/amount of data in your cluster?

 Thanks,
 Anurag Tangri

 On Feb 20, 2014, at 6:57 PM, ch huang justlo...@gmail.com wrote:

  hi, maillist:
   I see the following info in my HDFS log. The block belongs to a file
 written (appended to) by Scribe, and I don't know why this happens.
 Is there some limit in HDFS that I am hitting?

 [quoted log snipped; identical to the log in the original message]