[ https://issues.apache.org/jira/browse/HUDI-4542?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17699198#comment-17699198 ]
Jarred Li edited comment on HUDI-4542 at 3/11/23 9:26 AM:
----------------------------------------------------------
I have the same issue. I copied hive-exec-2.3.1-core.jar into /usr/lib/flink/lib.


was (Author: leejianwei):
I have the same issue. Can I know what the workaround is? I copied hive-exec.jar into /usr/lib/flink/lib, however there is a class conflict.

> Flink streaming query fails with ClassNotFoundException
> -------------------------------------------------------
>
>                 Key: HUDI-4542
>                 URL: https://issues.apache.org/jira/browse/HUDI-4542
>             Project: Apache Hudi
>          Issue Type: Bug
>          Components: flink-sql
>            Reporter: Ethan Guo
>            Priority: Critical
>             Fix For: 0.13.1
>
>         Attachments: Screen Shot 2022-08-04 at 17.17.42.png
>
>
> Environment: EMR 6.7.0, Flink 1.14.2
>
> Reproducible steps:
>
> Build the Hudi Flink bundle from master:
> {code:java}
> mvn clean package -DskipTests -pl :hudi-flink1.14-bundle -am {code}
> Copy the bundle jar to /lib/flink/lib on the EMR master node.
> Launch the Flink SQL client:
> {code:java}
> cd /lib/flink && ./bin/yarn-session.sh --detached
> ./bin/sql-client.sh {code}
> Write a Hudi table with a few commits with the metadata table enabled (no column stats). Then run the following streaming query:
> {code:java}
> CREATE TABLE t2(
>   uuid VARCHAR(20) PRIMARY KEY NOT ENFORCED,
>   name VARCHAR(10),
>   age INT,
>   ts TIMESTAMP(3),
>   `partition` VARCHAR(20)
> )
> PARTITIONED BY (`partition`)
> WITH (
>   'connector' = 'hudi',
>   'path' = 's3a://<table_path>',
>   'table.type' = 'MERGE_ON_READ',
>   'read.streaming.enabled' = 'true',  -- this option enables streaming read
>   'read.start-commit' = '20220803165232362',  -- specifies the start commit instant time
>   'read.streaming.check-interval' = '4'  -- specifies the check interval for finding new source commits, default 60s
> ); {code}
> {code:java}
> select * from t2; {code}
> {code:java}
> Flink SQL> select * from t2;
> 2022-08-05 00:12:43,635 INFO  org.apache.hadoop.metrics2.impl.MetricsConfig [] - Loaded properties from hadoop-metrics2.properties
> 2022-08-05 00:12:43,650 INFO  org.apache.hadoop.metrics2.impl.MetricsSystemImpl [] - Scheduled Metric snapshot period at 300 second(s).
> 2022-08-05 00:12:43,650 INFO  org.apache.hadoop.metrics2.impl.MetricsSystemImpl [] - s3a-file-system metrics system started
> 2022-08-05 00:12:47,722 INFO  org.apache.hadoop.fs.s3a.S3AInputStream [] - Switching to Random IO seek policy
> 2022-08-05 00:12:47,941 INFO  org.apache.hadoop.yarn.client.RMProxy [] - Connecting to ResourceManager at ip-172-31-9-157.us-east-2.compute.internal/172.31.9.157:8032
> 2022-08-05 00:12:47,942 INFO  org.apache.hadoop.yarn.client.AHSProxy [] - Connecting to Application History server at ip-172-31-9-157.us-east-2.compute.internal/172.31.9.157:10200
> 2022-08-05 00:12:47,942 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - No path for the flink jar passed. Using the location of class org.apache.flink.yarn.YarnClusterDescriptor to locate the jar
> 2022-08-05 00:12:47,942 WARN  org.apache.flink.yarn.YarnClusterDescriptor [] - Neither the HADOOP_CONF_DIR nor the YARN_CONF_DIR environment variable is set. The Flink YARN Client needs one of these to be set to properly load the Hadoop configuration for accessing YARN.
> 2022-08-05 00:12:47,959 INFO  org.apache.flink.yarn.YarnClusterDescriptor [] - Found Web Interface ip-172-31-3-92.us-east-2.compute.internal:39605 of application 'application_1659656614768_0001'.
> [ERROR] Could not execute SQL statement.
> Reason:
> java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat{code}
> {code:java}
> 2022-08-04 17:12:59
> org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
>     at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
>     at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
>     at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:228)
>     at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:218)
>     at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:209)
>     at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:679)
>     at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:79)
>     at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:444)
>     at sun.reflect.GeneratedMethodAccessor35.invoke(Unknown Source)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:316)
>     at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:314)
>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
>     at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
>     at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
>     at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
>     at scala.PartialFunction.applyOrElse(PartialFunction.scala:123)
>     at scala.PartialFunction.applyOrElse$(PartialFunction.scala:122)
>     at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
>     at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:172)
>     at akka.actor.Actor.aroundReceive(Actor.scala:537)
>     at akka.actor.Actor.aroundReceive$(Actor.scala:535)
>     at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
>     at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
>     at akka.actor.ActorCell.invoke(ActorCell.scala:548)
>     at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
>     at akka.dispatch.Mailbox.run(Mailbox.scala:231)
>     at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
>     at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
>     at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
>     at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
>     at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
> Caused by: java.lang.NoClassDefFoundError: org/apache/hadoop/hive/ql/io/parquet/MapredParquetInputFormat
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     at java.lang.ClassLoader.defineClass1(Native Method)
>     at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
>     at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>     at java.net.URLClassLoader.defineClass(URLClassLoader.java:473)
>     at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
>     at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
>     at java.security.AccessController.doPrivileged(Native Method)
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     at org.apache.hudi.sink.partitioner.profile.WriteProfiles.getCommitMetadata(WriteProfiles.java:236)
>     at org.apache.hudi.source.IncrementalInputSplits.lambda$inputSplits$2(IncrementalInputSplits.java:284)
>     at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
>     at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1384)
>     at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:482)
>     at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:472)
>     at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
>     at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
>     at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:566)
>     at org.apache.hudi.source.IncrementalInputSplits.inputSplits(IncrementalInputSplits.java:284)
>     at org.apache.hudi.source.StreamReadMonitoringFunction.monitorDirAndForwardSplits(StreamReadMonitoringFunction.java:199)
>     at org.apache.hudi.source.StreamReadMonitoringFunction.run(StreamReadMonitoringFunction.java:172)
>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:116)
>     at org.apache.flink.streaming.api.operators.StreamSource.run(StreamSource.java:73)
>     at org.apache.flink.streaming.runtime.tasks.SourceStreamTask$LegacySourceFunctionThread.run(SourceStreamTask.java:323)
> Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:387)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:352)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
>     ... 51 more
> {code}
> !Screen Shot 2022-08-04 at 17.17.42.png|width=1127,height=400!

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
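The failure comes down to org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat (shipped in hive-exec) not being on the Flink classpath when Hudi reads commit metadata. A minimal, standalone probe, assuming nothing beyond the JDK (the ClasspathProbe class and isLoadable helper are hypothetical, not part of Hudi or Flink), can be used to verify whether the jar-copy workaround actually made the class visible:

{code:java}
// Hypothetical classpath probe: reports whether a class resolves on the
// current classpath, mirroring the lookup that fails in the stack trace.
public class ClasspathProbe {

    // Returns true if the named class can be found by this class's loader.
    // initialize=false avoids running static initializers of the target class.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className, false, ClasspathProbe.class.getClassLoader());
            return true;
        } catch (ClassNotFoundException | NoClassDefFoundError e) {
            return false;
        }
    }

    public static void main(String[] args) {
        String hiveClass = "org.apache.hadoop.hive.ql.io.parquet.MapredParquetInputFormat";
        System.out.println(hiveClass + " loadable: " + isLoadable(hiveClass));
    }
}
{code}

Running this with the same classpath as the Flink JobManager/TaskManager (e.g. with the contents of /usr/lib/flink/lib on the classpath) should print "loadable: true" once a jar containing the Hive class is in place; "false" reproduces the condition behind the ClassNotFoundException above.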