Robert Hou created DRILL-6569:
---------------------------------

             Summary: Jenkins Regression: TPCDS query 19 fails with INTERNAL_ERROR ERROR: Can not read value at 2 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_13_1.parquet
                 Key: DRILL-6569
                 URL: https://issues.apache.org/jira/browse/DRILL-6569
             Project: Apache Drill
          Issue Type: Bug
          Components: Execution - Relational Operators
    Affects Versions: 1.14.0
            Reporter: Robert Hou
            Assignee: Pritesh Maker
             Fix For: 1.14.0


This is TPCDS Query 19.

Query: /root/drillAutomation/framework-master/framework/resources/Advanced/tpcds/tpcds_sf100/hive/parquet/query19.sql

SELECT i_brand_id              brand_id,
       i_brand                 brand,
       i_manufact_id,
       i_manufact,
       Sum(ss_ext_sales_price) ext_price
FROM   date_dim,
       store_sales,
       item,
       customer,
       customer_address,
       store
WHERE  d_date_sk = ss_sold_date_sk
       AND ss_item_sk = i_item_sk
       AND i_manager_id = 38
       AND d_moy = 12
       AND d_year = 1998
       AND ss_customer_sk = c_customer_sk
       AND c_current_addr_sk = ca_address_sk
       AND Substr(ca_zip, 1, 5) <> Substr(s_zip, 1, 5)
       AND ss_store_sk = s_store_sk
GROUP  BY i_brand,
          i_brand_id,
          i_manufact_id,
          i_manufact
ORDER  BY ext_price DESC,
          i_brand,
          i_brand_id,
          i_manufact_id,
          i_manufact
LIMIT 100;
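
For reference, the client side of the failure runs through the test framework's JDBC path (DrillTestJdbc.executeQuery in the trace below). The following is a minimal standalone sketch of driving the same scan through Drill's JDBC driver; the ZooKeeper connect string and the hive.tpcds_sf100 schema name are assumptions about the test cluster, not taken from this report:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.SQLException;
import java.sql.Statement;

public class Query19Repro {
    public static void main(String[] args) throws SQLException {
        // Hypothetical connect string; substitute the test cluster's ZooKeeper quorum.
        String url = "jdbc:drill:zk=localhost:2181/drill/drillbits1";
        try (Connection conn = DriverManager.getConnection(url);
             Statement stmt = conn.createStatement()) {
            // Any scan that forces the Hive Parquet reader over store_sales should
            // hit the same decoding error; query 19 is simply the regression case.
            try (ResultSet rs = stmt.executeQuery(
                    "SELECT Sum(ss_ext_sales_price) FROM hive.tpcds_sf100.store_sales")) {
                while (rs.next()) {
                    System.out.println(rs.getBigDecimal(1));
                }
            }
        }
    }
}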

Here is the stack trace:
2018-06-29 07:00:32 INFO  DrillTestLogger:348 - 
Exception:

java.sql.SQLException: INTERNAL_ERROR ERROR: Can not read value at 2 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_13_1.parquet

Fragment 4:26

[Error Id: 6401a71e-7a5d-4a10-a17c-16873fc3239b on atsqa6c88.qa.lab:31010]

  (hive.org.apache.parquet.io.ParquetDecodingException) Can not read value at 2 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_13_1.parquet
    hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():243
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():199
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():57
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.hasNextValue():417
    org.apache.drill.exec.store.hive.readers.HiveParquetReader.next():54
    org.apache.drill.exec.physical.impl.ScanBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    org.apache.drill.exec.physical.impl.SingleSenderCreator$SingleSenderRootExec.innerNext():93
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.UnsupportedOperationException) org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$8$1
    hive.org.apache.parquet.io.api.PrimitiveConverter.addInt():101
    hive.org.apache.parquet.column.impl.ColumnReaderImpl$2$3.writeValue():254
    hive.org.apache.parquet.column.impl.ColumnReaderImpl.writeCurrentValueToConverter():371
    hive.org.apache.parquet.io.RecordReaderImplementation.read():405
    hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():218
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():199
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():57
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.hasNextValue():417
    org.apache.drill.exec.store.hive.readers.HiveParquetReader.next():54
    org.apache.drill.exec.physical.impl.ScanBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    org.apache.drill.exec.physical.impl.SingleSenderCreator$SingleSenderRootExec.innerNext():93
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748

        at org.apache.drill.jdbc.impl.DrillCursor.nextRowInternally(DrillCursor.java:528)
        at org.apache.drill.jdbc.impl.DrillCursor.loadInitialSchema(DrillCursor.java:600)
        at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:1904)
        at org.apache.drill.jdbc.impl.DrillResultSetImpl.execute(DrillResultSetImpl.java:64)
        at oadd.org.apache.calcite.avatica.AvaticaConnection$1.execute(AvaticaConnection.java:630)
        at org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1109)
        at org.apache.drill.jdbc.impl.DrillMetaImpl.prepareAndExecute(DrillMetaImpl.java:1120)
        at oadd.org.apache.calcite.avatica.AvaticaConnection.prepareAndExecuteInternal(AvaticaConnection.java:638)
        at org.apache.drill.jdbc.impl.DrillConnectionImpl.prepareAndExecuteInternal(DrillConnectionImpl.java:200)
        at oadd.org.apache.calcite.avatica.AvaticaStatement.executeInternal(AvaticaStatement.java:149)
        at oadd.org.apache.calcite.avatica.AvaticaStatement.executeQuery(AvaticaStatement.java:218)
        at org.apache.drill.jdbc.impl.DrillStatementImpl.executeQuery(DrillStatementImpl.java:110)
        at org.apache.drill.test.framework.DrillTestJdbc.executeQuery(DrillTestJdbc.java:210)
        at org.apache.drill.test.framework.DrillTestJdbc.run(DrillTestJdbc.java:115)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:748)
Caused by: oadd.org.apache.drill.common.exceptions.UserRemoteException: INTERNAL_ERROR ERROR: Can not read value at 2 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_13_1.parquet

Fragment 4:26

[Error Id: 6401a71e-7a5d-4a10-a17c-16873fc3239b on atsqa6c88.qa.lab:31010]

  (hive.org.apache.parquet.io.ParquetDecodingException) Can not read value at 2 in block 0 in file maprfs:///drill/testdata/tpcds_sf100/parquet/store_sales/1_13_1.parquet
    hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():243
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():199
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():57
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.hasNextValue():417
    org.apache.drill.exec.store.hive.readers.HiveParquetReader.next():54
    org.apache.drill.exec.physical.impl.ScanBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    org.apache.drill.exec.physical.impl.SingleSenderCreator$SingleSenderRootExec.innerNext():93
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748
  Caused By (java.lang.UnsupportedOperationException) org.apache.hadoop.hive.ql.io.parquet.convert.ETypeConverter$8$1
    hive.org.apache.parquet.io.api.PrimitiveConverter.addInt():101
    hive.org.apache.parquet.column.impl.ColumnReaderImpl$2$3.writeValue():254
    hive.org.apache.parquet.column.impl.ColumnReaderImpl.writeCurrentValueToConverter():371
    hive.org.apache.parquet.io.RecordReaderImplementation.read():405
    hive.org.apache.parquet.hadoop.InternalParquetRecordReader.nextKeyValue():218
    hive.org.apache.parquet.hadoop.ParquetRecordReader.nextKeyValue():227
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():199
    org.apache.hadoop.hive.ql.io.parquet.read.ParquetRecordReaderWrapper.next():57
    org.apache.drill.exec.store.hive.readers.HiveAbstractReader.hasNextValue():417
    org.apache.drill.exec.store.hive.readers.HiveParquetReader.next():54
    org.apache.drill.exec.physical.impl.ScanBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.sniffNonEmptyBatch():276
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.prefetchFirstBatchFromBothSides():238
    org.apache.drill.exec.physical.impl.join.HashJoinBatch.buildSchema():218
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.physical.impl.aggregate.HashAggBatch.buildSchema():118
    org.apache.drill.exec.record.AbstractRecordBatch.next():152
    org.apache.drill.exec.record.AbstractRecordBatch.next():119
    org.apache.drill.exec.record.AbstractRecordBatch.next():109
    org.apache.drill.exec.record.AbstractUnaryRecordBatch.innerNext():63
    org.apache.drill.exec.physical.impl.project.ProjectRecordBatch.innerNext():147
    org.apache.drill.exec.record.AbstractRecordBatch.next():172
    org.apache.drill.exec.physical.impl.BaseRootExec.next():103
    org.apache.drill.exec.physical.impl.SingleSenderCreator$SingleSenderRootExec.innerNext():93
    org.apache.drill.exec.physical.impl.BaseRootExec.next():93
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():294
    org.apache.drill.exec.work.fragment.FragmentExecutor$1.run():281
    java.security.AccessController.doPrivileged():-2
    javax.security.auth.Subject.doAs():422
    org.apache.hadoop.security.UserGroupInformation.doAs():1595
    org.apache.drill.exec.work.fragment.FragmentExecutor.run():281
    org.apache.drill.common.SelfCleaningRunnable.run():38
    java.util.concurrent.ThreadPoolExecutor.runWorker():1149
    java.util.concurrent.ThreadPoolExecutor$Worker.run():624
    java.lang.Thread.run():748

        at oadd.org.apache.drill.exec.rpc.user.QueryResultHandler.resultArrived(QueryResultHandler.java:123)
        at oadd.org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:422)
        at oadd.org.apache.drill.exec.rpc.user.UserClient.handle(UserClient.java:96)
        at oadd.org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:274)
        at oadd.org.apache.drill.exec.rpc.RpcBus$InboundHandler.decode(RpcBus.java:244)
        at oadd.io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:88)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at oadd.io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:287)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at oadd.io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at oadd.io.netty.handler.codec.ByteToMessageDecoder.fireChannelRead(ByteToMessageDecoder.java:312)
        at oadd.io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:286)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at oadd.io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:335)
        at oadd.io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1294)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:356)
        at oadd.io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:342)
        at oadd.io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:911)
        at oadd.io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at oadd.io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
        at oadd.io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
        at oadd.io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
        at oadd.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
        at oadd.io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:131)
        ... 1 more

The commit id is:
1.14.0-SNAPSHOT 140d09e69b65ac2cb1bed09a37fa5861d39a99b3 DRILL-6539: Record count not set for this vector container error 28.06.2018 @ 16:13:20 PDT Unknown 28.06.2018 @ 16:21:42 PDT
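
Analysis note: in parquet-mr, the value callbacks on org.apache.parquet.io.api.PrimitiveConverter (addInt(), addBinary(), and so on) throw UnsupportedOperationException unless a subclass overrides them. That is exactly the "Caused By" above: the Hive ETypeConverter instance chosen for this column does not override addInt(), i.e. the converter built from the table schema does not match the column's physical INT32 storage in 1_13_1.parquet. Below is a minimal sketch of that failure mode against the unshaded parquet-mr API, using a hypothetical stand-in converter rather than the actual Hive classes:

import org.apache.parquet.io.api.Binary;
import org.apache.parquet.io.api.PrimitiveConverter;

public class ConverterMismatchSketch {
    // A converter written for binary-backed columns only, standing in for the
    // anonymous ETypeConverter$8$1 from the trace (hypothetical, for illustration).
    static class BinaryOnlyConverter extends PrimitiveConverter {
        @Override
        public void addBinary(Binary value) {
            System.out.println("read " + value.length() + " bytes");
        }
        // addInt() is deliberately not overridden: the inherited
        // PrimitiveConverter.addInt() throws UnsupportedOperationException
        // carrying the converter's class name, as seen in the stack trace.
    }

    public static void main(String[] args) {
        PrimitiveConverter converter = new BinaryOnlyConverter();
        // The column reader dispatches on the column's physical type. When the
        // file stores the column as INT32 but the converter expects another type,
        // this is the call that fails:
        converter.addInt(42); // throws java.lang.UnsupportedOperationException
    }
}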


