Jackkaabe opened a new issue, #10084:
URL: https://github.com/apache/hudi/issues/10084

   When Hudi is integrated with Hive, querying the Hive external table reports an error.
   For example (SQL):

   ```sql
   select id from hive_ods_tb_report_data_order_info_rt group by id;
   ```
   
   Error:

   ```
   org.apache.hadoop.mapred.YarnChild: Exception running child : java.io.IOException: java.lang.reflect.InvocationTargetException
       at org.apache.hadoop.hive.io.HiveIOExceptionHandlerChain.handleRecordReaderCreationException(HiveIOExceptionHandlerChain.java:97)
       at org.apache.hadoop.hive.io.HiveIOExceptionHandlerUtil.handleRecordReaderCreationException(HiveIOExceptionHandlerUtil.java:57)
       at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:271)
       at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.<init>(HadoopShimsSecure.java:217)
       at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileInputFormatShim.getRecordReader(HadoopShimsSecure.java:345)
       at org.apache.hadoop.hive.ql.io.CombineHiveInputFormat.getRecordReader(CombineHiveInputFormat.java:719)
       at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:175)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:444)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
       at java.security.AccessController.doPrivileged(Native Method)
       at javax.security.auth.Subject.doAs(Subject.java:422)
       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
   Caused by: java.lang.reflect.InvocationTargetException
       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
       at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
       at org.apache.hadoop.hive.shims.HadoopShimsSecure$CombineFileRecordReader.initNextRecordReader(HadoopShimsSecure.java:257)
       ... 11 more
   Caused by: java.lang.IllegalArgumentException: HoodieRealtimeRecordReader can only work on RealtimeSplit and not with hdfs://sgsdatacluster/user/hudi/warehouse/test_hudi/tb_report_data_order_info/5e08ca88-76ed-492e-8b01-ba4a6ae2f8b9_0-1-0_20230915155733914.parquet:0+796908
       at org.apache.hudi.common.util.ValidationUtils.checkArgument(ValidationUtils.java:40)
       at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:61)
       at org.apache.hadoop.hive.ql.io.CombineHiveRecordReader.<init>(CombineHiveRecordReader.java:99)
       ... 16 more
   ```
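   For context on this first failure: HoodieParquetRealtimeInputFormat.getRecordReader validates that every split it receives is a RealtimeSplit, but Hive's default hive.input.format (CombineHiveInputFormat) repackages the table's files into plain combine splits, which trips that check. Below is a minimal sketch (not from the original report) of a session that simply avoids split combining by using the stock non-combining HiveInputFormat, with the same query as above:

   ```sql
   -- Sketch: the non-combining HiveInputFormat defers to the table's declared
   -- input format (HoodieParquetRealtimeInputFormat), so the splits it hands
   -- to the record reader remain RealtimeSplits.
   set hive.input.format=org.apache.hadoop.hive.ql.io.HiveInputFormat;
   select id from hive_ods_tb_report_data_order_info_rt group by id;
   ```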
   
   Then the Hive client was configured as follows:

   ```sql
   set hive.input.format=org.apache.hudi.hadoop.hive.HoodieCombineHiveInputFormat;
   set hoodie.hudimor.consume.mode=INCREMENTAL;
   set hoodie.hudimor.consume.max.commits=-1;
   ```
   
   Error:

   ```
   org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.NoSuchMethodError: org.apache.parquet.schema.Types$PrimitiveBuilder.as(Lorg/apache/parquet/schema/LogicalTypeAnnotation;)Lorg/apache/parquet/schema/Types$Builder;
       at org.apache.parquet.avro.AvroSchemaConverter.convertField(AvroSchemaConverter.java:177)
       at org.apache.parquet.avro.AvroSchemaConverter.convertUnion(AvroSchemaConverter.java:242)
       at org.apache.parquet.avro.AvroSchemaConverter.convertField(AvroSchemaConverter.java:199)
       at org.apache.parquet.avro.AvroSchemaConverter.convertField(AvroSchemaConverter.java:152)
       at org.apache.parquet.avro.AvroSchemaConverter.convertField(AvroSchemaConverter.java:260)
       at org.apache.parquet.avro.AvroSchemaConverter.convertFields(AvroSchemaConverter.java:146)
       at org.apache.parquet.avro.AvroSchemaConverter.convert(AvroSchemaConverter.java:137)
       at org.apache.hudi.common.table.TableSchemaResolver.readSchemaFromLogFile(TableSchemaResolver.java:485)
       at org.apache.hudi.common.table.TableSchemaResolver.readSchemaFromLogFile(TableSchemaResolver.java:468)
       at org.apache.hudi.common.table.TableSchemaResolver.fetchSchemaFromFiles(TableSchemaResolver.java:604)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableParquetSchemaFromDataFile(TableSchemaResolver.java:251)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchemaFromDataFile(TableSchemaResolver.java:117)
       at org.apache.hudi.common.table.TableSchemaResolver.hasOperationField(TableSchemaResolver.java:537)
       at org.apache.hudi.util.Lazy.get(Lazy.java:53)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableSchemaFromLatestCommitMetadata(TableSchemaResolver.java:208)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchemaInternal(TableSchemaResolver.java:176)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchema(TableSchemaResolver.java:138)
       at org.apache.hudi.common.table.TableSchemaResolver.getTableAvroSchema(TableSchemaResolver.java:127)
       at org.apache.hudi.hadoop.realtime.AbstractRealtimeRecordReader.init(AbstractRealtimeRecordReader.java:90)
       at org.apache.hudi.hadoop.realtime.AbstractRealtimeRecordReader.<init>(AbstractRealtimeRecordReader.java:72)
       at org.apache.hudi.hadoop.realtime.RealtimeCompactedRecordReader.<init>(RealtimeCompactedRecordReader.java:62)
       at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.constructRecordReader(HoodieRealtimeRecordReader.java:70)
       at org.apache.hudi.hadoop.realtime.HoodieRealtimeRecordReader.<init>(HoodieRealtimeRecordReader.java:47)
       at org.apache.hudi.hadoop.realtime.HoodieParquetRealtimeInputFormat.getRecordReader(HoodieParquetRealtimeInputFormat.java:74)
       at org.apache.hadoop.hive.ql.io.HiveInputFormat.getRecordReader(HiveInputFormat.java:417)
       at org.apache.hadoop.mapred.MapTask$TrackedRecordReader.<init>(MapTask.java:175)
       at org.apache.hadoop.mapred.MapTask.runOldMapper(MapTask.java:444)
       at org.apache.hadoop.mapred.MapTask.run(MapTask.java:349)
       at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:174)
       at java.security.AccessController.doPrivileged(Native Method)
       at javax.security.auth.Subject.doAs(Subject.java:422)
       at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1729)
       at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:168)
   ```
          
   Environment information:
   Hudi version: 0.12.1
   Hive version: 3.1.2
   Flink version: 1.13.6
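
   A note not in the original report, but inferable from the trace and the versions above: the NoSuchMethodError names the Types$PrimitiveBuilder.as(LogicalTypeAnnotation) overload, and LogicalTypeAnnotation only exists in Parquet 1.11.0 and later, while Hive 3.1.2 ships an older Parquet (1.10.x). So the failing task is most likely loading Hive's older parquet jar ahead of the newer one Hudi's parquet-avro code path expects. A small self-contained Java check (the class name is ours) to confirm which jar actually supplies the Parquet classes on a given classpath:

   ```java
   import java.net.URL;
   import java.security.CodeSource;

   public class ParquetJarLocator {
       public static void main(String[] args) {
           // Prints the jar that supplies Parquet's Types builder API. Run it
           // with the same classpath as the failing task; if it prints a
           // parquet 1.10.x jar, the LogicalTypeAnnotation overload of
           // Types$PrimitiveBuilder.as(...) (added in Parquet 1.11.0) is
           // absent there, matching the NoSuchMethodError above.
           CodeSource src = org.apache.parquet.schema.Types.class
                   .getProtectionDomain().getCodeSource();
           URL location = (src == null) ? null : src.getLocation();
           System.out.println(location);
       }
   }
   ```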
   
   What can I do to avoid this error?
   

