Hi,

  Env: hbase-1.1.4
  Phoenix: 4.10
   I am querying a very simple table in Phoenix:

    CREATE TABLE IF NOT EXISTS BCM.PATH (
        PATH VARCHAR NOT NULL,
        IS_BRANCH BOOLEAN
        CONSTRAINT BCM_path_pk PRIMARY KEY (PATH)
    ) COMPRESSION='SNAPPY', DATA_BLOCK_ENCODING='FAST_DIFF',
      VERSIONS=1000, KEEP_DELETED_CELLS=true

  And I run this query:

   Select count(1) from bcm.path
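
  The plan Phoenix picks for this count can be dumped with EXPLAIN; a minimal sketch (the exact output depends on regions and stats, but a plain count is aggregated server side):

    EXPLAIN SELECT COUNT(1) FROM BCM.PATH;
    -- typically something like:
    --   CLIENT ... FULL SCAN OVER BCM.PATH
    --   SERVER FILTER BY FIRST KEY ONLY
    --   SERVER AGGREGATE INTO SINGLE ROW
    -- i.e. the count runs server side, through the aggregate coprocessor seen in the trace below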

Sometimes, not always, it gives me this error. This is a small table with about
100K records, and with nothing changed it sometimes gives the right result.

org.apache.phoenix.exception.PhoenixIOException: org.apache.hadoop.hbase.DoNotRetryIOException: BCM:PATH,,1520015853032.88b3078dd61c0aaab1692fdc5d561dc2.: null
        at org.apache.phoenix.util.ServerUtil.createIOException(ServerUtil.java:89)
        at org.apache.phoenix.util.ServerUtil.throwIOException(ServerUtil.java:55)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.overrideDelegate(BaseScannerRegionObserver.java:256)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.nextRaw(BaseScannerRegionObserver.java:282)
        at org.apache.hadoop.hbase.regionserver.RSRpcServices.scan(RSRpcServices.java:2448)
        at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:32385)
        at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2117)
        at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:104)
        at org.apache.hadoop.hbase.ipc.RpcExecutor.consumerLoop(RpcExecutor.java:133)
        at org.apache.hadoop.hbase.ipc.RpcExecutor$1.run(RpcExecutor.java:108)
        at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.UnsupportedOperationException
        at org.apache.phoenix.schema.PTable$QualifierEncodingScheme$1.decode(PTable.java:243)
        at org.apache.phoenix.schema.tuple.EncodedColumnQualiferCellsList.add(EncodedColumnQualiferCellsList.java:136)
        at org.apache.phoenix.schema.tuple.EncodedColumnQualiferCellsList.add(EncodedColumnQualiferCellsList.java:55)
        at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:573)
        at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5516)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:5667)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5454)
        at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5440)
        at org.apache.phoenix.coprocessor.UngroupedAggregateRegionObserver.doPostScannerOpen(UngroupedAggregateRegionObserver.java:497)
        at org.apache.phoenix.coprocessor.BaseScannerRegionObserver$RegionScannerHolder.overrideDelegate(BaseScannerRegionObserver.java:237)
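
  The frames in the Caused-by are all in Phoenix's encoded column qualifier path (QualifierEncodingScheme.decode via EncodedColumnQualiferCellsList), so the encoding scheme recorded for the table may be relevant. A sketch of how to look it up, assuming the 4.10 SYSTEM.CATALOG exposes the ENCODING_SCHEME and IMMUTABLE_STORAGE_SCHEME columns:

    SELECT TABLE_SCHEM, TABLE_NAME, ENCODING_SCHEME, IMMUTABLE_STORAGE_SCHEME
    FROM SYSTEM.CATALOG
    WHERE TABLE_SCHEM = 'BCM' AND TABLE_NAME = 'PATH'
      AND ENCODING_SCHEME IS NOT NULL;   -- assumed column names from the 4.10 column-mapping feature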

  Any hints? I don't see anything in the region server logs that looks related.

Thanks,
Nan
