[ https://issues.apache.org/jira/browse/FLINK-31417?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17702207#comment-17702207 ]

Jingsong Lee commented on FLINK-31417:
--------------------------------------

Thanks [~nonggia], this has been fixed in 0.3, will cherry-pick to 0.4.

> Hadoop version unknown when TrinoPageSourceBase.getNextPage
> -----------------------------------------------------------
>
>                 Key: FLINK-31417
>                 URL: https://issues.apache.org/jira/browse/FLINK-31417
>             Project: Flink
>          Issue Type: Bug
>          Components: Table Store
>    Affects Versions: table-store-0.4.0
>            Reporter: nonggia.liang
>            Priority: Major
>
> Exception thrown when querying flink-table-store via Trino
> {code:java}
> 2023-03-13T11:46:36.694+0800    ERROR   SplitRunner-11-113      io.trino.execution.executor.TaskExecutor        Error processing Split 20230313_034504_00000_jdcet.1.0.0-11 {} (start = 3.599627617710298E10, wall = 89264 ms, cpu = 0 ms, wait = 1 ms, calls = 1)
> java.lang.NoClassDefFoundError: Could not initialize class org.apache.flink.table.store.shaded.org.apache.orc.impl.RecordReaderUtils
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.RecordReaderImpl.<init>(RecordReaderImpl.java:257)
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.ReaderImpl.rows(ReaderImpl.java:649)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createRecordReader(OrcReaderFactory.java:284)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createReader(OrcReaderFactory.java:98)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createReader(OrcReaderFactory.java:56)
>         at org.apache.flink.table.store.file.utils.FileUtils.createFormatReader(FileUtils.java:108)
>         at org.apache.flink.table.store.file.io.KeyValueDataFileRecordReader.<init>(KeyValueDataFileRecordReader.java:55)
>         at org.apache.flink.table.store.file.io.KeyValueFileReaderFactory.createRecordReader(KeyValueFileReaderFactory.java:95)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.lambda$readerForRun$1(MergeTreeReaders.java:89)
>         at org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader.create(ConcatRecordReader.java:50)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.readerForRun(MergeTreeReaders.java:92)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.readerForSection(MergeTreeReaders.java:74)
>         at org.apache.flink.table.store.file.operation.KeyValueFileStoreRead.lambda$createReader$2(KeyValueFileStoreRead.java:195)
>         at org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader.create(ConcatRecordReader.java:50)
>         at org.apache.flink.table.store.file.operation.KeyValueFileStoreRead.createReader(KeyValueFileStoreRead.java:204)
>         at org.apache.flink.table.store.table.source.KeyValueTableRead.createReader(KeyValueTableRead.java:44)
>         at org.apache.flink.table.store.trino.TrinoPageSourceProvider.createPageSource(TrinoPageSourceProvider.java:76)
>         at org.apache.flink.table.store.trino.TrinoPageSourceProvider.lambda$createPageSource$0(TrinoPageSourceProvider.java:52)
>         at org.apache.flink.table.store.trino.ClassLoaderUtils.runWithContextClassLoader(ClassLoaderUtils.java:30)
>         at org.apache.flink.table.store.trino.TrinoPageSourceProvider.createPageSource(TrinoPageSourceProvider.java:51)
>         at io.trino.split.PageSourceManager.createPageSource(PageSourceManager.java:68)
>         at io.trino.operator.TableScanOperator.getOutput(TableScanOperator.java:308)
>         at io.trino.operator.Driver.processInternal(Driver.java:388)
>         at io.trino.operator.Driver.lambda$processFor$9(Driver.java:292)
>         at io.trino.operator.Driver.tryWithLock(Driver.java:685)
>         at io.trino.operator.Driver.processFor(Driver.java:285)
>         at io.trino.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1076)
>         at io.trino.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
>         at io.trino.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:488)
>         at io.trino.$gen.Trino_366_0____20230313_034413_2.run(Unknown Source)
>         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>         at java.base/java.lang.Thread.run(Thread.java:834)
> 2023-03-13T11:46:36.775+0800    ERROR   remote-task-callback-2  io.trino.execution.scheduler.PipelinedStageExecution    Pipelined stage execution for stage 20230313_034504_00000_jdcet.1 failed
> java.lang.ExceptionInInitializerError
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.RecordReaderImpl.<init>(RecordReaderImpl.java:257)
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.ReaderImpl.rows(ReaderImpl.java:649)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createRecordReader(OrcReaderFactory.java:284)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createReader(OrcReaderFactory.java:98)
>         at org.apache.flink.table.store.format.orc.OrcReaderFactory.createReader(OrcReaderFactory.java:56)
>         at org.apache.flink.table.store.file.utils.FileUtils.createFormatReader(FileUtils.java:108)
>         at org.apache.flink.table.store.file.io.KeyValueDataFileRecordReader.<init>(KeyValueDataFileRecordReader.java:55)
>         at org.apache.flink.table.store.file.io.KeyValueFileReaderFactory.createRecordReader(KeyValueFileReaderFactory.java:95)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.lambda$readerForRun$1(MergeTreeReaders.java:89)
>         at org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader.create(ConcatRecordReader.java:50)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.readerForRun(MergeTreeReaders.java:92)
>         at org.apache.flink.table.store.file.mergetree.MergeTreeReaders.readerForSection(MergeTreeReaders.java:74)
>         at org.apache.flink.table.store.file.operation.KeyValueFileStoreRead.lambda$createReader$2(KeyValueFileStoreRead.java:195)
>         at org.apache.flink.table.store.file.mergetree.compact.ConcatRecordReader.readBatch(ConcatRecordReader.java:65)
>         at org.apache.flink.table.store.file.mergetree.DropDeleteReader.readBatch(DropDeleteReader.java:44)
>         at org.apache.flink.table.store.table.source.KeyValueTableRead$RowDataRecordReader.readBatch(KeyValueTableRead.java:61)
>         at org.apache.flink.table.store.trino.TrinoPageSourceBase.nextPage(TrinoPageSourceBase.java:120)
>         at org.apache.flink.table.store.trino.TrinoPageSourceBase.getNextPage(TrinoPageSourceBase.java:113)
>         at io.trino.operator.TableScanOperator.getOutput(TableScanOperator.java:311)
>         at io.trino.operator.Driver.processInternal(Driver.java:388)
>         at io.trino.operator.Driver.lambda$processFor$9(Driver.java:292)
>         at io.trino.operator.Driver.tryWithLock(Driver.java:685)
>         at io.trino.operator.Driver.processFor(Driver.java:285)
>         at io.trino.execution.SqlTaskExecution$DriverSplitRunner.processFor(SqlTaskExecution.java:1076)
>         at io.trino.execution.executor.PrioritizedSplitRunner.process(PrioritizedSplitRunner.java:163)
>         at io.trino.execution.executor.TaskExecutor$TaskRunner.run(TaskExecutor.java:488)
>         at io.trino.$gen.Trino_366_0____20230313_034413_2.run(Unknown Source)
>         at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
>         at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>         at java.base/java.lang.Thread.run(Thread.java:834)
> Caused by: java.lang.NumberFormatException: For input string: "Unknown"
>         at java.base/java.lang.NumberFormatException.forInputString(NumberFormatException.java:65)
>         at java.base/java.lang.Integer.parseInt(Integer.java:652)
>         at java.base/java.lang.Integer.parseInt(Integer.java:770)
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.HadoopShimsFactory.get(HadoopShimsFactory.java:53)
>         at org.apache.flink.table.store.shaded.org.apache.orc.impl.RecordReaderUtils.<clinit>(RecordReaderUtils.java:47)
>         ... 30 more
> 2023-03-13T11:46:36.777+0800    ERROR   stage-scheduler io.trino.execution.scheduler.SqlQueryScheduler  Failure in distributed stage for query 20230313_034504_00000_jdcet
> java.lang.ExceptionInInitializerError
>         [stack trace and Caused by identical to the previous entry]
> 2023-03-13T11:46:36.784+0800    ERROR   stage-scheduler io.trino.execution.StageStateMachine    Stage 20230313_034504_00000_jdcet.1 failed
> java.lang.ExceptionInInitializerError
>         [stack trace and Caused by identical to the previous entry]
> {code}
> It seems the common-version-info.properties file in flink-shaded-hadoop-2-uber-2.8.3-10.0.jar is not found by the classloader, so the Hadoop version resolves to "Unknown" and HadoopShimsFactory fails to parse it. The stack traces show the call originates from TrinoPageSourceBase.getNextPage, where the current thread's context classloader is the AppClassLoader rather than the PluginClassLoader.
> Can we fix it by using runWithContextClassLoader to run TrinoPageSourceBase.getNextPage with TrinoPageSourceBase.class.getClassLoader()?
>  
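For what it's worth, a minimal sketch of both halves of the analysis above, i.e. why a Hadoop version string of "Unknown" blows up in integer parsing, and the proposed shape of the fix of temporarily switching the thread context classloader. This is hypothetical illustration code, not the actual Table Store / Trino patch; ClassLoaderFixSketch and majorVersion are made-up names, and the wrapper only mirrors the spirit of ClassLoaderUtils.runWithContextClassLoader:

```java
import java.util.function.Supplier;

public final class ClassLoaderFixSketch {

    // Mirrors the failing pattern: parse the major version out of a Hadoop
    // version string such as "2.8.3". When the version resolves to "Unknown"
    // (properties file not visible to the classloader), parseInt throws
    // NumberFormatException, which surfaces as ExceptionInInitializerError.
    static int majorVersion(String hadoopVersion) {
        return Integer.parseInt(hadoopVersion.split("\\.")[0]);
    }

    // Proposed shape of the fix: run an action with the plugin classloader as
    // the thread context classloader, restoring the previous one afterwards.
    static <T> T runWithContextClassLoader(Supplier<T> action, ClassLoader pluginClassLoader) {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(pluginClassLoader);
        try {
            return action.get();
        } finally {
            current.setContextClassLoader(previous);
        }
    }
}
```

Under this sketch, getNextPage (and any other entry point Trino invokes directly on the page source) would wrap its body in such a helper with TrinoPageSourceBase.class.getClassLoader(), so the shaded ORC code resolves common-version-info.properties from the plugin jar instead of the application classpath.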



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
