Thank you for taking a look. This is the first time we've seen this specific
issue with fast diff encoding, though perhaps it has happened in the past as
well (we did encounter a couple of these IndexOutOfBounds or BufferOverflow
issues before, but back then we dropped the affected data and moved on). We
have also had data become corrupted from time to time, ending up in a format
Phoenix no longer expects, but in those cases we could always still read it at
the HBase level, unlike this time.
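
(For context, by "reading at the HBase level" I mean bypassing Phoenix and
scanning the raw table from the HBase shell, along these lines -- row key
elided here:

  scan 'qa2.ADGROUPS', {STARTROW => "\x05\x80...", LIMIT => 1}

In the earlier corruption cases a raw scan like that still worked; this time
even that path fails.)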

- Will
On Mon, Dec 10, 2018 at 9:19 AM Stack <[email protected]> wrote:

> Thank you William Shen. Looks like a corruption (introduced either at write
> time or subsequently). Does it happen frequently?
> Thanks,
> S
>
> On Thu, Dec 6, 2018 at 12:24 PM William Shen <[email protected]> wrote:
>
> > I have created https://issues.apache.org/jira/browse/HBASE-21563 in case
> > anyone else would like to have a go at reading the HFile. Thank you!
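> >
> > (For anyone picking it up: with the file downloaded locally from the
> > JIRA, reading it standalone should be roughly
> >
> >   hbase hfile -f file:///path/to/downloaded/hfile -p
> >
> > with the local path as a placeholder -- and it should run into the same
> > "Unknown code 65" exception shown further down the thread.)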
> >
> > On Wed, Dec 5, 2018 at 3:24 PM William Shen <[email protected]> wrote:
> >
> > > In addition, when running hbase hfile -f -p, kv pairs were printed
> > > until the program hit the following exception:
> > >
> > > Exception in thread "main" java.lang.RuntimeException: Unknown code 65
> > > at org.apache.hadoop.hbase.KeyValue$Type.codeToType(KeyValue.java:259)
> > > at org.apache.hadoop.hbase.KeyValue.keyToString(KeyValue.java:1246)
> > > at org.apache.hadoop.hbase.io.encoding.BufferedDataBlockEncoder$ClonedSeekerState.toString(BufferedDataBlockEncoder.java:506)
> > > at java.lang.String.valueOf(String.java:2994)
> > > at java.lang.StringBuilder.append(StringBuilder.java:131)
> > > at org.apache.hadoop.hbase.io.hfile.HFilePrettyPrinter.scanKeysValues(HFilePrettyPrinter.java:382)
> > > at org.apache.hadoop.hbase.io.hfile.HFilePrettyPrinter.processFile(HFilePrettyPrinter.java:316)
> > > at org.apache.hadoop.hbase.io.hfile.HFilePrettyPrinter.run(HFilePrettyPrinter.java:255)
> > > at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
> > > at org.apache.hadoop.hbase.io.hfile.HFilePrettyPrinter.main(HFilePrettyPrinter.java:677)
> > >
> > > On Wed, Dec 5, 2018 at 2:24 PM William Shen <[email protected]> wrote:
> > >
> > >> Thank you Stack.
> > >> I was able to isolate the specific HFile causing the exception. Would
> > >> you mind teaching me how to play with the file standalone? I am not
> > >> sure I know how to do that.
> > >> Thanks!
> > >>
> > >> On Wed, Dec 5, 2018 at 1:04 PM Stack <[email protected]> wrote:
> > >>
> > >>> Looks like a bug in FastDiffDeltaEncoder triggered by whatever the
> > >>> current form of the target file is. Can you figure out which file it
> > >>> is (going by the Get coordinates)? I suppose the compactor is running
> > >>> into the same problem (I was thinking a major compaction might get you
> > >>> over this hump). You could make a copy of the problematic file and
> > >>> play with it standalone to see if you can figure out the bug. Failing
> > >>> that, if you can't figure it out yourself, post it to a JIRA so
> > >>> someone else might have a go at it?
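> > >>>
> > >>> (A rough, untested sketch of the standalone route -- the paths here
> > >>> are made up, adjust to your layout:
> > >>>
> > >>>   # copy the suspect HFile out of the region directory
> > >>>   hdfs dfs -get /hbase/data/default/qa2.ADGROUPS/<region>/<cf>/<hfile> /tmp/bad_hfile
> > >>>   # point the HFile pretty-printer at the local copy
> > >>>   hbase hfile -f file:///tmp/bad_hfile -p -v
> > >>>
> > >>> The major-compaction attempt is just major_compact 'qa2.ADGROUPS'
> > >>> from the hbase shell, though if the compactor trips over the same
> > >>> decode bug it will fail the same way.)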
> > >>>
> > >>> Thanks,
> > >>> S
> > >>>
> > >>> On Wed, Dec 5, 2018 at 11:22 AM William Shen <[email protected]> wrote:
> > >>>
> > >>> > Hi there,
> > >>> >
> > >>> > We've recently encountered an issue retrieving data from our HBase
> > >>> > cluster, and have not had much luck troubleshooting it. We narrowed
> > >>> > the issue down to a single GET, which appears to fail because
> > >>> > FastDiffDeltaEncoder.java runs into a
> > >>> > java.lang.IndexOutOfBoundsException. Has anyone encountered similar
> > >>> > issues before, or does anyone have experience troubleshooting an
> > >>> > issue like this one? Any help would be much appreciated! We are
> > >>> > running HBase 1.2.0-cdh5.9.2, and the GET in question is:
> > >>> >
> > >>> > hbase(main):004:0> get 'qa2.ADGROUPS',
> > >>> > "\x05\x80\x00\x00\x00\x00\x1F\x54\x9C\x80\x00\x00\x00\x00\x1C\x7D\x45\x00\x04\x80\x00\x00\x00\x00\x1D\x0F\x19\x80\x00\x00\x00\x00\x4A\x64\x6F\x80\x00\x00\x00\x01\xD9\xDB\xCE"
> > >>> >
> > >>> > COLUMN                                                CELL
> > >>> >
> > >>> > ERROR: java.io.IOException
> > >>> > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2215)
> > >>> > at org.apache.hadoop.hbase.ipc.CallRunner.run(CallRunner.java:109)
> > >>> > at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:185)
> > >>> > at org.apache.hadoop.hbase.ipc.RpcExecutor$Handler.run(RpcExecutor.java:165)
> > >>> > Caused by: java.lang.IndexOutOfBoundsException
> > >>> > at java.nio.Buffer.checkBounds(Buffer.java:567)
> > >>> > at java.nio.HeapByteBuffer.get(HeapByteBuffer.java:149)
> > >>> > at org.apache.hadoop.hbase.io.encoding.FastDiffDeltaEncoder$1.decode(FastDiffDeltaEncoder.java:465)
> > >>> > at org.apache.hadoop.hbase.io.encoding.FastDiffDeltaEncoder$1.decodeNext(FastDiffDeltaEncoder.java:516)
> > >>> > at org.apache.hadoop.hbase.io.encoding.BufferedDataBlockEncoder$BufferedEncodedSeeker.next(BufferedDataBlockEncoder.java:618)
> > >>> > at org.apache.hadoop.hbase.io.hfile.HFileReaderV2$EncodedScannerV2.next(HFileReaderV2.java:1277)
> > >>> > at org.apache.hadoop.hbase.regionserver.StoreFileScanner.next(StoreFileScanner.java:180)
> > >>> > at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:108)
> > >>> > at org.apache.hadoop.hbase.regionserver.StoreScanner.next(StoreScanner.java:588)
> > >>> > at org.apache.hadoop.hbase.regionserver.KeyValueHeap.next(KeyValueHeap.java:147)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.populateResult(HRegion.java:5706)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextInternal(HRegion.java:5865)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.nextRaw(HRegion.java:5643)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:5620)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion$RegionScannerImpl.next(HRegion.java:5606)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6801)
> > >>> > at org.apache.hadoop.hbase.regionserver.HRegion.get(HRegion.java:6779)
> > >>> > at org.apache.hadoop.hbase.regionserver.RSRpcServices.get(RSRpcServices.java:2029)
> > >>> > at org.apache.hadoop.hbase.protobuf.generated.ClientProtos$ClientService$2.callBlockingMethod(ClientProtos.java:33644)
> > >>> > at org.apache.hadoop.hbase.ipc.RpcServer.call(RpcServer.java:2170)
> > >>> > ... 3 more
> > >>> >
> > >>> >
> > >>> > Thank you very much in advance!
> > >>> >
> > >>>
> > >>
> >
>
