While writing data to HBase using the HTable.put method, I occasionally end up
with the exception below. However, when I check that particular rowkey with a
get operation, the data has actually been written to HBase.
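For context, the write path looks roughly like the sketch below (the table name, column family, qualifier, and row key are placeholders; the real ones differ):

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class PutExample {
        public static void main(String[] args) throws Exception {
            Configuration conf = HBaseConfiguration.create();
            try (Connection connection = ConnectionFactory.createConnection(conf);
                 Table table = connection.getTable(TableName.valueOf("my_table"))) {
                // Single put; the InterruptedIOException surfaces from the
                // client-side flush inside HTable.put (see the stack trace below).
                Put put = new Put(Bytes.toBytes("rowkey-1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q"), Bytes.toBytes("value"));
                table.put(put);
            }
        }
    }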

I have searched the HMaster and HRegionServer logs for the same time period to
identify the issue, but could not find anything relevant.

The hbase.client.* configurations are all left at their default values.

Please help me fine-tune the HBase configuration in order to avoid this
InterruptedIOException.
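If tuning means overriding client settings, I assume it would look something like the sketch below; the values shown are placeholders, not recommendations, and my question is which of these (or other) hbase.client.* keys actually matter here:

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;

    public class ClientConfigSketch {
        public static Configuration buildConf() {
            Configuration conf = HBaseConfiguration.create();
            // Placeholder overrides of client-side keys; currently nothing
            // like this is set anywhere, so 1.2.6 defaults apply.
            conf.setInt("hbase.client.retries.number", 35);
            conf.setInt("hbase.rpc.timeout", 60000);
            conf.setInt("hbase.client.operation.timeout", 1200000);
            conf.setLong("hbase.client.write.buffer", 2097152L);
            conf.setInt("hbase.client.max.total.tasks", 100);
            return conf;
        }
    }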

Hadoop distribution: Apache
Version: HBase 1.2.6
Cluster size: 12 nodes



java.io.InterruptedIOException: #17209, interrupted. currentNumberOfTask=1
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1764)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForMaximumCurrentTasks(AsyncProcess.java:1734)
    at org.apache.hadoop.hbase.client.AsyncProcess.waitForAllPreviousOpsAndReset(AsyncProcess.java:1810)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.backgroundFlushCommits(BufferedMutatorImpl.java:240)
    at org.apache.hadoop.hbase.client.BufferedMutatorImpl.flush(BufferedMutatorImpl.java:190)
    at org.apache.hadoop.hbase.client.HTable.flushCommits(HTable.java:1434)
    at org.apache.hadoop.hbase.client.HTable.put(HTable.java:1018)

Please help me solve this.

Someone has faced the same exception before, but in that thread there is no
explanation of which configurations need to be checked in order to avoid it:

https://groups.google.com/forum/#!topic/nosql-databases/UxfrmWl_ZnM
Regards,
Manimekalai K
