UninitializedMessageException thrown while getting values

2018-01-17 Thread Karthick Ram
"UninitializedMessageException : Message missing required fields : region,
get", is thrown while performing Get. Due to this all the Get requests to
the same Region Server are getting stalled.

com.google.protobuf.UninitializedMessageException: Message missing required fields: region, get
    at com.google.protobuf.AbstractMessage$Builder.newUninitializedMessageException(AbstractMessage.java:770)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetRequest$Builder.build(ClientProtos.java:6377)
    at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$GetRequest$Builder.build(ClientProtos.java:6309)
    at org.apache.hadoop.hbase.ipc.RpcServer$Connection.processRequest(RpcServer.java:1840)
    at org.apache.hadoop.hbase.ipc.RpcServer$Connection.processOneRpc(RpcServer.java:1775)
    at org.apache.hadoop.hbase.ipc.RpcServer$Connection.process(RpcServer.java:1623)
    at org.apache.hadoop.hbase.ipc.RpcServer$Connection.readAndProcess(RpcServer.java:1603)
    at org.apache.hadoop.hbase.ipc.RpcServer$Listener.doRead(RpcServer.java:861)
    at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.doRunLoop(RpcServer.java:643)
    at org.apache.hadoop.hbase.ipc.RpcServer$Listener$Reader.run(RpcServer.java:619)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
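
For context, a minimal sketch of an ordinary client-side Get is shown below
(the table, row, and column names are placeholders, not taken from the report
above). The required 'region' and 'get' fields of the GetRequest protobuf are
normally populated by the HBase client internals rather than by application
code, which is what makes the server-side error surprising for a plain read
like this.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class SimpleGet {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        try (Connection conn = ConnectionFactory.createConnection(conf);
             // "t1", "row1", "cf", "q1" are placeholder names for illustration
             Table table = conn.getTable(TableName.valueOf("t1"))) {
            Get get = new Get(Bytes.toBytes("row1"));
            get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q1"));
            // The client converts this Get into a protobuf GetRequest and fills
            // in its required 'region' and 'get' fields internally before the
            // RPC is sent to the Region Server.
            Result result = table.get(get);
            System.out.println(result);
        }
    }
}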


Empty byte array values for cells which don't exist

2017-09-24 Thread Karthick Ram
Hi, we have a table which, when queried with more than one column qualifier
for a row key (using addColumn(columnFamily, columnQualifier)), returns cells
that are not present. It returns an empty byte array as the value for those
cells. Using a debugger we found the timestamp of those cells to be
*'OLDEST_TIMESTAMP'* and the type to be *'Minimum'*. These turn out to be fake
cells; however, when queried with only one column qualifier, no such cells are
returned. Please look into the following files and suggest some ways to
rectify this problem.
1. HConstants.java
2. KeyValue.java
3. ScanQueryMatcher.java
4. StoreFileScanner.java

NOTE:
We are not able to reproduce the same in other tables.
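
As a stopgap while the root cause is investigated, such cells could be
filtered on the client side. Below is a minimal sketch, assuming the fake
cells can be recognized by the OLDEST_TIMESTAMP timestamp and Minimum type
byte observed in the debugger; the table, row, family, and qualifier names
are placeholders.

import java.util.List;
import org.apache.hadoop.hbase.Cell;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HConstants;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class FilterFakeCells {
    // Heuristic based on what the debugger showed: the fake cells carry
    // HConstants.OLDEST_TIMESTAMP and KeyValue.Type.Minimum.
    static boolean isFakeCell(Cell cell) {
        return cell.getTimestamp() == HConstants.OLDEST_TIMESTAMP
                && cell.getTypeByte() == KeyValue.Type.Minimum.getCode();
    }

    public static void main(String[] args) throws Exception {
        try (Connection conn = ConnectionFactory.createConnection(HBaseConfiguration.create());
             Table table = conn.getTable(TableName.valueOf("t1"))) {    // placeholder table name
            Get get = new Get(Bytes.toBytes("row1"));                   // placeholder row key
            get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q1"));    // two qualifiers requested,
            get.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("q2"));    // which reproduces the issue
            Result result = table.get(get);
            List<Cell> cells = result.listCells();
            if (cells != null) {
                for (Cell cell : cells) {
                    if (isFakeCell(cell)) {
                        continue;   // skip cells that do not really exist
                    }
                    System.out.println(cell);
                }
            }
        }
    }
}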


Encryption of existing data in Stripe Compaction

2017-06-14 Thread Karthick Ram
We have a table holding time series data with Stripe Compaction enabled.
After encryption was enabled for this table, newer entries are encrypted as
they are inserted. However, to encrypt the data already in the table, a major
compaction has to run, and since stripe compaction doesn't allow a major
compaction, we are unable to encrypt the previous data. Please suggest some
ways to rectify this problem.

Regards,
Karthick R