Chris,

Actually, now I am seeing a non-deterministic IOOBE like yours (with length = 1).

Note in the call stack way below that it's coming from an isNull() method.  The
isNull() method was called with an index of 0 when the top-level vector
container had one row.

*It looks like the subvector used to track null values didn't get filled in
right.*  (I can't tell yet if it also means that the value-printing code in the
HBase test is missing something about a schema change.)
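
Roughly what that path looks like, as I read the trace below (a simplified
sketch, not the actual NullableVarBinaryVector/UInt1Vector code): isNull()
goes through isSet(), which reads one byte per row out of the $bits$
subvector's buffer, so if that buffer never got allocated/filled for this
batch, the read of byte 0 fails.

// Simplified sketch (NOT the real Drill classes) of the null-tracking path
// in the trace below: isNull() -> isSet() -> one byte per row read from the
// "$bits$" subvector's buffer.
class NullBitsSketch {
  // Imagine this buffer never got sized/filled for the 1-row batch:
  byte[] bits = new byte[0];

  boolean isSet(int index) {
    // Reading bits[0] from an empty buffer is the same shape of failure as
    // "index: 0, length: 1 (expected: range(0, 0))".
    return bits[index] == 1;
  }

  boolean isNull(int index) {
    return !isSet(index);
  }
}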

This is from an HBase query with "WHERE row_key = 'a2' or row_key between 'b5' and 'b6'".
A batch of 2 rows from the "between ..." part comes first--*non-deterministically*
(the order was consistent for many runs, but it switched just now in the run I did to
capture the schemas for this message)--and then the batch of one row for the "= 'a2'"
part seems messed up in the second column family (f2):

The first batch's schema is (note *f2*'s type):
BatchSchema [fields=[`row_key`(VARBINARY:REQUIRED)[`$offsets$`(UINT4:REQUIRED)], `f`(MAP:REQUIRED)[`f`.`c1`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c1`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c2`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c2`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c3`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c3`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c4`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c4`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c5`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c5`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c6`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c6`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c8`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c8`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]]], `f2`(MAP:REQUIRED)[`f2`.`c1`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`.`c1`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f2`.`c3`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`.`c3`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f2`.`c5`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`.`c5`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f2`.`c7`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`.`c7`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f2`.`c9`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`.`c9`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]]]], selectionVector=NONE]

The second batch's schema is (*note f2's type*):
BatchSchema [fields=[`row_key`(VARBINARY:REQUIRED)[`$offsets$`(UINT4:REQUIRED)], `f`(MAP:REQUIRED)[`f`.`c1`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c1`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c2`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c2`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c3`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c3`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c4`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c4`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c5`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c5`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]], `f`.`c6`(VARBINARY:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f`.`c6`(VARBINARY:OPTIONAL)[`$offsets$`(UINT4:REQUIRED)]]], `f2`(INT:OPTIONAL)[`$bits$`(UINT1:REQUIRED), `f2`(INT:OPTIONAL)]], selectionVector=NONE]

*Is f2's type of **INT:OPTIONAL** correct?*

Why wouldn't f2's type be MAP:REQUIRED?  Even if the HBase reader didn't see any
HBase columns in column family f2, doesn't it still know that f2 is a column
family, and shouldn't it still set Drill's f2 column to be a map rather than
something of type INT:OPTIONAL?
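
What I'd naively expect the reader setup to do, sketched below with a made-up
helper (HTableDescriptor.getFamilies() is real HBase API, but
createMapVector() is just a placeholder for however the reader registers a
MAP:REQUIRED column; this is not the actual Drill HBase reader code):

import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;

// Hypothetical sketch of the behavior I'd expect: every column family
// declared on the table gets a MAP-typed vector up front, whether or not
// any cells for it show up in the current batch.
class ExpectedReaderSetupSketch {
  void setUpFamilyVectors(HTableDescriptor table) {
    for (HColumnDescriptor family : table.getFamilies()) {
      createMapVector(family.getNameAsString());  // made-up placeholder
    }
  }

  void createMapVector(String familyName) {
    // placeholder; whatever actually creates/registers the MAP column
  }
}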



Stack trace at IndexOutOfBoundsException saying "index: 0, length: 1 (expected: range(0, 0))":
IndexOutOfBoundsException.<init>(String) line: 56
DrillBuf.checkIndexD(int, int) line: 189
DrillBuf.chk(int, int) line: 211
DrillBuf.getByte(int) line: 746
UInt1Vector$Accessor.get(int) line: 364
NullableVarBinaryVector$Accessor.isSet(int) line: 391
NullableVarBinaryVector$Accessor.isNull(int) line: 387
NullableVarBinaryVector$Accessor.getObject(int) line: 411
NullableVarBinaryVector$Accessor.getObject(int) line: 1
MapVector$Accessor.getObject(int) line: 313
VectorUtil.showVectorAccessibleContent(VectorAccessible, int[]) line: 167
TestHBaseFilterPushDown(BaseTestQuery).printResult(List<QueryDataBatch>) line: 487
TestHBaseFilterPushDown(BaseHBaseTest).printResultAndVerifyRowCount(List<QueryDataBatch>, int) line: 95
TestHBaseFilterPushDown(BaseHBaseTest).runHBaseSQLVerifyCount(String, int) line: 91
TestHBaseFilterPushDown.testTEMP5() line: 796
NativeMethodAccessorImpl.invoke0(Method, Object, Object[]) line: not available [native method]
NativeMethodAccessorImpl.invoke(Object, Object[]) line: 57
DelegatingMethodAccessorImpl.invoke(Object, Object[]) line: 43
Method.invoke(Object, Object...) line: 606
FrameworkMethod$1.runReflectiveCall() line: 47
FrameworkMethod$1(ReflectiveCallable).run() line: 12
FrameworkMethod.invokeExplosively(Object, Object...) line: 44
JUnit4TestRunnerDecorator.executeTestMethod(FrameworkMethod, Object, Object...) line: 120
JUnit4TestRunnerDecorator.invokeExplosively(FrameworkMethod, Object, Object...) line: 65
MockFrameworkMethod.invokeExplosively(Invocation, Object, Object...) line: 29
GeneratedMethodAccessor133.invoke(Object, Object[]) line: not available
DelegatingMethodAccessorImpl.invoke(Object, Object[]) line: 43
Method.invoke(Object, Object...) line: 606
MethodReflection.invokeWithCheckedThrows(Object, Method, Object...) line: 95
MockMethodBridge.callMock(Object, boolean, String, String, String, int, int, boolean, Object[]) line: 76
MockMethodBridge.invoke(Object, Method, Object[]) line: 41
<unknown receiving type>(FrameworkMethod).invokeExplosively(Object, Object...) line: 44
InvokeMethod.evaluate() line: 17
RunBefores.evaluate() line: 26
RunAfters.evaluate() line: 27
TestWatcher$1.evaluate() line: 55
TestWatcher$1.evaluate() line: 55
TestWatcher$1.evaluate() line: 55
ExpectedException$ExpectedExceptionStatement.evaluate() line: 168
TestWatcher$1.evaluate() line: 55
RunRules.evaluate() line: 20
BlockJUnit4ClassRunner(ParentRunner<T>).runLeaf(Statement, Description, RunNotifier) line: 271
BlockJUnit4ClassRunner.runChild(FrameworkMethod, RunNotifier) line: 70
BlockJUnit4ClassRunner.runChild(Object, RunNotifier) line: 50
ParentRunner$3.run() line: 238
ParentRunner$1.schedule(Runnable) line: 63
BlockJUnit4ClassRunner(ParentRunner<T>).runChildren(RunNotifier) line: 236
ParentRunner<T>.access$000(ParentRunner, RunNotifier) line: 53
ParentRunner$2.evaluate() line: 229
RunBefores.evaluate() line: 26
RunAfters.evaluate() line: 27
BlockJUnit4ClassRunner(ParentRunner<T>).run(RunNotifier) line: 309
JUnit4TestClassReference(JUnit4TestReference).run(TestExecution) line: 50
TestExecution.run(ITestReference[]) line: 38
RemoteTestRunner.runTests(String[], String, TestExecution) line: 459
RemoteTestRunner.runTests(TestExecution) line: 675
RemoteTestRunner.run() line: 382
RemoteTestRunner.main(String[]) line: 192
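
For reference, the "index: 0, length: 1 (expected: range(0, 0))" text is the
shape of a buffer bounds check failing: the read wants 1 byte starting at
index 0, but the buffer's valid range is (0, 0), i.e. capacity 0.  A minimal
sketch of that kind of check (not the actual DrillBuf/Netty code):

// Minimal sketch (not the actual DrillBuf/Netty code) of a bounds check that
// produces a message of this shape: reading `length` bytes at `index`
// requires 0 <= index and index + length <= capacity.
class BoundsCheckSketch {
  static void checkIndex(int index, int length, int capacity) {
    if (index < 0 || length < 0 || index + length > capacity) {
      throw new IndexOutOfBoundsException(String.format(
          "index: %d, length: %d (expected: range(0, %d))",
          index, length, capacity));
    }
  }

  public static void main(String[] args) {
    checkIndex(0, 1, 0);  // capacity 0 reproduces the message in the trace above
  }
}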

Daniel


Chris Westin wrote:
I seem to recall you were telling me about a new IOOB that you're seeing, was that you?
Is it this?

Execution Failures:
/root/drillAutomation/framework-master/framework/resources/Functional/aggregates/tpcds_variants/csv/aggregate26.q
Query:
select cast(case columns[0] when '' then 0 else columns[0] end as int) as 
soldd, cast(case columns[1] when '' then 0 else columns[1] end as bigint) as 
soldt, cast(case columns[2] when '' then 0 else columns[2] end as float) as 
itemsk, cast(case columns[3] when '' then 0 else columns[3] end as 
decimal(18,9)) as custsk, cast(case columns[4] when '' then 0 else columns[4] 
end as varchar(20)) as cdemo, columns[5] as hdemo, columns[6] as addrsk, 
columns[7] as storesk, columns[8] as promo, columns[9] as tickn, sum(case 
columns[10] when '' then 0 else cast(columns[10] as int) end) as quantities 
from `store_sales.dat` group by cast(case columns[0] when '' then 0 else 
columns[0] end as int), cast(case columns[1] when '' then 0 else columns[1] end 
as bigint), cast(case columns[2] when '' then 0 else columns[2] end as float), 
cast(case columns[3] when '' then 0 else columns[3] end as decimal(18,9)), 
cast(case columns[4] when '' then 0 else columns[4] end as varchar(20)), 
columns[5], columns[6], columns[7], columns[8], columns[9] order by soldd desc, 
soldt desc, itemsk desc limit 20
Failed with exception
java.sql.SQLException: SYSTEM ERROR: IndexOutOfBoundsException: index: 0, length: 1 (expected: range(0, 0))

Fragment 0:0

[Error Id: 2322e296-4fff-4770-a778-c277770ea4d7 on atsqa6c61.qa.lab:31010]
        at org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:247)
        at org.apache.drill.jdbc.impl.DrillCursor.next(DrillCursor.java:320)
        at oadd.net.hydromatic.avatica.AvaticaResultSet.next(AvaticaResultSet.java:187)
        at org.apache.drill.jdbc.impl.DrillResultSetImpl.next(DrillResultSetImpl.java:160)
        at org.apache.drill.test.framework.DrillTestJdbc.executeQuery(DrillTestJdbc.java:203)
        at org.apache.drill.test.framework.DrillTestJdbc.run(DrillTestJdbc.java:89)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:744)
Caused by: oadd.org.apache.drill.common.exceptions.UserRemoteException: SYSTEM ERROR: IndexOutOfBoundsException: index: 0, length: 1 (expected: range(0, 0))

Fragment 0:0


--
Daniel Barclay
MapR Technologies
