Re: StreamTableEnvironment initialization failed -- "Could not find any factories that implement 'org.apache.flink.table.delegation.ExecutorFactory' in the classpath"
Hi Sharil,

I've tried your suggestion, but unfortunately it does not work; I get the same exception. Any other ideas?

Thanks,
Leo

On 2023/5/15 20:15, Sharil Shafie wrote:
> Hi,
>
> Maybe you could try the table planner loader instead:
>
>     org.apache.flink
>     flink-table-planner-loader
>     1.16.0
>     provided
>
> Regards.
>
> On Mon, 15 May 2023, 18:54 krislee wrote:
>> Hi ALL,
>>
>> OS: CentOS 7.9
>> Flink version: 1.16.0
>>
>> It looks like I'm hitting a notorious exception which has been reported since earlier Flink versions. The issue is triggered when the Java code below executes:
>>
>>     StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);
>>
>> The detailed trace is as follows:
>>
>>     Exception in thread "main" org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
>>         at org.apache.flink.table.api.bridge.internal.AbstractStreamTableEnvironmentImpl.lookupExecutor(AbstractStreamTableEnvironmentImpl.java:109)
>>         at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:101)
>>         at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:122)
>>         at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:94)
>>         at com.sugon.cloud.paas.flink.cdc.FlinkCDC_mysql2doris_example.main(FlinkCDC_mysql2doris_example.java:63)
>>     Caused by: org.apache.flink.table.api.ValidationException: Could not find any factories that implement 'org.apache.flink.table.delegation.ExecutorFactory' in the classpath.
>>         at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:533)
>>         at org.apache.flink.table.api.bridge.internal.AbstractStreamTableEnvironmentImpl.lookupExecutor(AbstractStreamTableEnvironmentImpl.java:106)
>>         ... 4 more
>>
>> What I've done:
>>
>> 1) Added the missing dependencies in "pom.xml", for example:
>>
>>     org.apache.flink  flink-table-api-java-uber  1.16.1  provided
>>     org.apache.flink  flink-table-planner_${scala.binary.version}  ${flink.version}  provided
>>
>> 2) Tried two methods to run the application; both produced the same error (see above):
>>
>>     mvn exec:java -Dexec.mainClass="xxx"
>>     java -jar target/xxx.jar
>>
>> I'm confused by the error because all the necessary jar files do exist in Maven's local repository and in FLINK_HOME's lib directory. The complete "pom.xml" is included as an attachment.
>>
>> Thanks,
>> Leo
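For reference, the planner-loader suggestion from the quoted reply, written out as a complete Maven dependency block (a sketch assuming Flink 1.16.0; note that the Flink documentation advises putting either flink-table-planner-loader or the Scala-suffixed flink-table-planner on the classpath, never both at once):

```xml
<!-- Sketch of a pom.xml fragment: loads the planner via the
     planner-loader artifact instead of flink-table-planner_2.12.
     Remove any flink-table-planner_${scala.binary.version} dependency
     before adding this, since the two conflict. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner-loader</artifactId>
    <version>1.16.0</version>
    <scope>provided</scope>
</dependency>
```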
StreamTableEnvironment initialization failed -- "Could not find any factories that implement 'org.apache.flink.table.delegation.ExecutorFactory' in the classpath"
Hi ALL,

OS: CentOS 7.9
Flink version: 1.16.0

It looks like I'm hitting a notorious exception which has been reported since earlier Flink versions. The issue is triggered when the Java code below executes:

    StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

The detailed trace is as follows:

    Exception in thread "main" org.apache.flink.table.api.TableException: Could not instantiate the executor. Make sure a planner module is on the classpath
        at org.apache.flink.table.api.bridge.internal.AbstractStreamTableEnvironmentImpl.lookupExecutor(AbstractStreamTableEnvironmentImpl.java:109)
        at org.apache.flink.table.api.bridge.java.internal.StreamTableEnvironmentImpl.create(StreamTableEnvironmentImpl.java:101)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:122)
        at org.apache.flink.table.api.bridge.java.StreamTableEnvironment.create(StreamTableEnvironment.java:94)
        at com.sugon.cloud.paas.flink.cdc.FlinkCDC_mysql2doris_example.main(FlinkCDC_mysql2doris_example.java:63)
    Caused by: org.apache.flink.table.api.ValidationException: Could not find any factories that implement 'org.apache.flink.table.delegation.ExecutorFactory' in the classpath.
        at org.apache.flink.table.factories.FactoryUtil.discoverFactory(FactoryUtil.java:533)
        at org.apache.flink.table.api.bridge.internal.AbstractStreamTableEnvironmentImpl.lookupExecutor(AbstractStreamTableEnvironmentImpl.java:106)
        ... 4 more

What I've done:

1) Added the missing dependencies in "pom.xml", for example:

    org.apache.flink  flink-table-api-java-uber  1.16.1  provided
    org.apache.flink  flink-table-planner_${scala.binary.version}  ${flink.version}  provided

2) Tried two methods to run the application; both produced the same error (see above):

    mvn exec:java -Dexec.mainClass="xxx"
    java -jar target/xxx.jar

I'm confused by the error because all the necessary jar files do exist in Maven's local repository and in FLINK_HOME's lib directory. The complete "pom.xml" is included as an attachment.
Thanks,
Leo

Attachment: pom.xml

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.mycompany.cloud.bigdata.flink</groupId>
    <artifactId>flink-cdc-doris-example</artifactId>
    <version>1.0-SNAPSHOT</version>
    <packaging>jar</packaging>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
        <flink.version>1.16.0</flink.version>
        <flink.connector.version>2.3.0</flink.connector.version>
        <scala.binary.version>2.12</scala.binary.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-streaming-java</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-bridge</artifactId>
            <version>${flink.version}</version>
        </dependency>
        <dependency>
            <groupId>com.ververica</groupId>
            <artifactId>flink-connector-mysql-cdc</artifactId>
            <version>${flink.connector.version}</version>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
        <dependency>
            <groupId>org.apache.flink</groupId>
            <artifactId>flink-table-api-java-uber</artifactId>
            <version>${flink.version}</version>
            <scope>provided</scope>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.1</version>
                <configuration>
                    <source>${maven.compiler.source}</source>
                    <target>${maven.compiler.target}</target>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-shade-plugin</artifactId>
                <version>3.2.4</version>
                <executions>
                    <execution>
                        <phase>package</phase>
                        <goals>
                            <goal>shade</goal>
                        </goals>
                        <configuration>
                            <transformers>
                                <transformer implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                                    <mainClass>com.mycompany.cloud.bigdata.flink.cdc.FlinkCDC_mysql2doris_example</mainClass>
                                </transformer>
                            </transformers>
                            <artifactSet>
                                <excludes>
                                    <exclude>org.apache.flink:force-shading</exclude>
                                </excludes>
                            </artifactSet>
                            <filters>
                                <filter>
                                    <artifact>*:*</artifact>
                                    <excludes>
                                        <exclude>META-INF/*.SF</exclude>
                                        <exclude>META-INF/*.DSA</exclude>
                                        <exclude>META-INF/*.RSA</exclude>
                                    </excludes>
                                </filter>
                            </filters>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>
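One likely culprit in the pom above (an observation worth checking, not a confirmed diagnosis): flink-table-planner and flink-table-api-java-uber are declared with scope "provided", which keeps them off the runtime classpath when the job is launched with mvn exec:java or plain java -jar outside a Flink distribution. A sketch of the change that would put the planner back on the local classpath:

```xml
<!-- Sketch: drop <scope>provided</scope> (i.e. fall back to the default
     "compile" scope) so the planner jar is on the classpath when the job
     is run locally via mvn exec:java or java -jar, rather than being
     expected from $FLINK_HOME/lib. Keep "provided" when submitting to a
     cluster whose lib/ already contains the planner. -->
<dependency>
    <groupId>org.apache.flink</groupId>
    <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
    <version>${flink.version}</version>
</dependency>
```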
Flink 1.15.1 source compile failed on "annotation" module
Hi,

I'm facing the error below when compiling the Flink 1.15.1 source on Windows (Win10). From the error stack, it appears that compilation fails in the "annotation" module.

    SEVERE: Step 'google-java-format' found problem in 'src\main\java\org\apache\flink\annotation\docs\ConfigGroup.java': Unable to resolve dependencies
    com.diffplug.spotless.maven.ArtifactResolutionException: Unable to resolve dependencies
        at com.diffplug.spotless.maven.ArtifactResolver.resolveDependencies(ArtifactResolver.java:88)
        at com.diffplug.spotless.maven.ArtifactResolver.resolve(ArtifactResolver.java:74)
        at com.diffplug.spotless.JarState.provisionWithTransitives(JarState.java:68)
        at com.diffplug.spotless.JarState.from(JarState.java:57)
        at com.diffplug.spotless.JarState.from(JarState.java:52)
        at com.diffplug.spotless.java.GoogleJavaFormatStep$State.<init>(GoogleJavaFormatStep.java:142)
        at com.diffplug.spotless.java.GoogleJavaFormatStep.lambda$create$0(GoogleJavaFormatStep.java:85)
        at com.diffplug.spotless.FormatterStepImpl.calculateState(FormatterStepImpl.java:56)
        at com.diffplug.spotless.LazyForwardingEquality.state(LazyForwardingEquality.java:56)
        at com.diffplug.spotless.FormatterStep$Strict.format(FormatterStep.java:76)
        at com.diffplug.spotless.Formatter.compute(Formatter.java:230)
        at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:201)
        at com.diffplug.spotless.PaddedCell.calculateDirtyState(PaddedCell.java:188)
        at com.diffplug.spotless.maven.SpotlessCheckMojo.process(SpotlessCheckMojo.java:52)
        at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:150)
        at com.diffplug.spotless.maven.AbstractSpotlessMojo.execute(AbstractSpotlessMojo.java:141)
        at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:137)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:210)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:156)
        at org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:148)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:117)
        at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:81)
        at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build(SingleThreadedBuilder.java:56)
        at org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:128)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:305)
        at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:192)
        at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:105)
        at org.apache.maven.cli.MavenCli.execute(MavenCli.java:972)
        at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:293)
        at org.apache.maven.cli.MavenCli.main(MavenCli.java:196)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
        at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.base/java.lang.reflect.Method.invoke(Method.java:566)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:282)
        at org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:225)
        at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:406)
        at org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:347)
    Caused by: org.eclipse.aether.resolution.DependencyResolutionException: Failed to collect dependencies at com.google.googlejavaformat:google-java-format:jar:1.7 -> com.google.errorprone:javac-shaded:jar:9+181-r4173-1
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:353)
        at com.diffplug.spotless.maven.ArtifactResolver.resolveDependencies(ArtifactResolver.java:86)
        ... 37 more
    Caused by: org.eclipse.aether.collection.DependencyCollectionException: Failed to collect dependencies at com.google.googlejavaformat:google-java-format:jar:1.7 -> com.google.errorprone:javac-shaded:jar:9+181-r4173-1
        at org.eclipse.aether.internal.impl.collect.DefaultDependencyCollector.collectDependencies(DefaultDependencyCollector.java:288)
        at org.eclipse.aether.internal.impl.DefaultRepositorySystem.resolveDependencies(DefaultRepositorySystem.java:309)
        ... 38 more
    Caused by: org.eclipse.aether.resolution.ArtifactDescriptorException: Failed to read artifact descriptor for
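Judging from the "Caused by" chain, the build is failing because Maven cannot resolve com.google.errorprone:javac-shaded:9+181-r4173-1, a transitive dependency of google-java-format 1.7 used by the Spotless format check; that is a repository/network issue rather than a compiler error. One commonly used workaround (a sketch, not an official recommendation) is to skip the Spotless check while building:

```shell
# Sketch: build the Flink sources without running the Spotless
# format check, so google-java-format never needs to be resolved.
# -DskipTests additionally skips the test phase to speed up the build.
mvn clean install -DskipTests -Dspotless.check.skip=true
```

If the formatter check is needed, the alternative is to fix the underlying resolution problem (proxy/mirror settings in ~/.m2/settings.xml, or a corrupted entry for javac-shaded in the local repository).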
flink job exception
Hi all,

I'm a Flink beginner. Today I found many exceptions in the Flink web UI and in the backend job-management page:

    ..
    11:29:30.107 [flink-akka.actor.default-dispatcher-41] ERROR org.apache.flink.runtime.rest.handler.job.JobExceptionsHandler - Exception occurred in REST handler: Job 16c614ab0d6f5b28746c66f351fb67f8 not found
    ..

At that point, after logging into the Flink web UI, the "Completed Jobs" page showed no history for any job, although the job information was visible when the jobs were originally submitted.

Environment:
Flink: 1.12.4 for Windows
Flink was started, and jobs were submitted, using start-cluster.bat and flink.bat from version 1.9.3.

My questions are: does Flink have a feature that periodically cleans up historical jobs? If so, where (via the command line or a configuration file) can the relevant parameters be configured? If not, are these error messages normal, and how can the problem be resolved?

Thanks,
Gang
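Regarding the question above: the JobManager does keep completed jobs in a bounded job store and expires them after a while, which is why the "Completed Jobs" page can later be empty and why the REST handler then reports "Job ... not found". A sketch of the relevant flink-conf.yaml settings (option names as documented for Flink 1.12; the values shown are illustrative, and the defaults should be verified against the configuration documentation):

```yaml
# Sketch of a flink-conf.yaml fragment controlling how long the
# JobManager retains finished jobs for the web UI / REST API.
jobstore.expiration-time: 3600    # seconds a completed job stays visible (default: 3600)
jobstore.max-capacity: 1000       # maximum number of completed jobs retained
jobstore.cache-size: 52428800     # maximum size in bytes of the completed-job cache
```

For retention beyond the JobManager's lifetime, Flink's separate History Server (historyserver.* options) can archive completed jobs to a filesystem directory.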