The ways you can lose data, in my view:
1. Some tuples share the same row-key + column-family + column qualifier. When you load your data into HBase, they are written to the same cell and may exceed the column family's configured maximum number of versions.
2. As Ted mentioned, you may be importing some Deletes; do you generate
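Mechanism (1) above can be sketched with a toy model of HBase cell versioning. This is not the HBase API, just an illustration of the semantics: puts that collide on the same row-key + column-family + qualifier land in one cell, and only the newest N versions survive (N defaulted to 3 in HBase releases of that era; check your table's VERSIONS setting).

```python
# Toy model of HBase cell versioning (illustrative names, not the HBase API).
# Tuples sharing the same row-key + column-family + qualifier land in the
# same cell; only the newest MAX_VERSIONS values are retained.

MAX_VERSIONS = 3  # assumed default VERSIONS for a column family

def put(table, row, cf, qual, ts, value, max_versions=MAX_VERSIONS):
    """Store value under (row, cf, qual); keep only the newest versions."""
    cell = table.setdefault((row, cf, qual), [])
    cell.append((ts, value))
    cell.sort(key=lambda tv: tv[0], reverse=True)  # newest timestamp first
    del cell[max_versions:]  # older versions are silently discarded

table = {}
# Five "distinct" tuples that collide on the same row/cf/qualifier:
for ts in range(1, 6):
    put(table, b"row1", b"d", b"col", ts, b"v%d" % ts)

# A default scan returns one value per cell (the newest), so five puts
# surface as a single visible value: "number of data less than expected".
visible = {key: versions[0][1] for key, versions in table.items()}
```

Five puts, but the cell retains only three versions and a default scan shows only the newest, which is exactly how row counts come up short after an import.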
Hi hbase users,
We ran into an issue when importing data via Thrift (Perl): the number of rows is less than expected.
When we scan the table, we get:
ERROR: java.lang.RuntimeException:
org.apache.hadoop.hbase.client.RetriesExhaustedException: Failed after
attempts=7, exceptions:
Tue Jul 23
Which HBase release are you using ?
Was it possible that the import included Deletes?
Cheers
On Tue, Jul 23, 2013 at 5:23 PM, Huangmao (Homer) Quan <luj...@gmail.com> wrote:
> Hi hbase users,
> We ran into an issue when importing data via Thrift (Perl).
> We found the number of rows is less than