Hi @Jun Zhang, I have been using the blink planner all along; that jar has always been on my classpath.
@Jark Wu I am running it directly in IDEA locally; I have not packaged it and run it on a cluster yet. Could that be related?

On 2020-07-07 15:40:17, "Jark Wu" <imj...@gmail.com> wrote:
> Hi,
>
> Are you running the packaged job on a cluster, or running it inside IDEA?
>
> Best,
> Jark
>
> On Tue, 7 Jul 2020 at 15:31, Jun Zhang <zhangjunemail...@gmail.com> wrote:
>> hi, sunfulin
>> Have you imported the blink planner? Try adding this dependency:
>>
>> <dependency>
>>   <groupId>org.apache.flink</groupId>
>>   <artifactId>flink-table-planner-blink_${scala.binary.version}</artifactId>
>>   <version>${flink.version}</version>
>> </dependency>
>>
>> sunfulin <sunfulin0...@163.com> wrote on Tue, 7 Jul 2020 at 15:21:
>>> hi, jark
>>> My job code is actually very simple; it is just the logic below. I am not sure whether I am missing some dependency or configuration. When I debugged the exception, it seemed to say that DeploymentOptions.TARGET (execution.target) in the Flink configuration did not match anything? I had never paid attention to this option before.
>>>
>>> // build the StreamExecutionEnvironment
>>> public static final StreamExecutionEnvironment env =
>>>         StreamExecutionEnvironment.getExecutionEnvironment();
>>>
>>> // build EnvironmentSettings and select the Blink planner
>>> private static final EnvironmentSettings bsSettings =
>>>         EnvironmentSettings.newInstance().useBlinkPlanner().inStreamingMode().build();
>>>
>>> // build the StreamTableEnvironment
>>> public static final StreamTableEnvironment tEnv =
>>>         StreamTableEnvironment.create(env, bsSettings);
>>>
>>> tEnv.executeSql("ddl sql");
>>>
>>> // register the source stream as a table
>>> tEnv.createTemporaryView("test", ds, $("f0").as("id"),
>>>         $("f1").as("first"), $("p").proctime());
>>>
>>> // lookup-join query
>>> Table table = tEnv.sqlQuery("select b.* from test a left join " +
>>>         "my_dim FOR SYSTEM_TIME AS OF a.p AS b on a.first = b.userId");
>>>
>>> // print the result
>>> tEnv.toAppendStream(table, Row.class).print("LookUpJoinJob");
>>>
>>> env.execute("LookUpJoinJob");
>>>
>>> On 2020-07-06 14:59:17, "Jark Wu" <imj...@gmail.com> wrote:
>>>> Could you share the job code that reproduces this?
>>>>
>>>> Best,
>>>> Jark
>>>>
>>>> On Mon, 6 Jul 2020 at 11:00, sunfulin <sunfulin0...@163.com> wrote:
>>>>> Hi,
>>>>> I am testing my job with the latest Flink 1.11 RC4, and it throws the following exception:
>>>>>
>>>>> org.apache.flink.table.api.TableException: Failed to execute sql
>>>>>
>>>>> Caused by: java.lang.IllegalStateException: No ExecutorFactory found to execute the application.
>>>>>     at org.apache.flink.core.execution.DefaultExecutorServiceLoader.getExecutorFactory(DefaultExecutorServiceLoader.java:84)
>>>>>
>>>>> What could be causing this exception? The same logic runs without any exception on 1.10.1.
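[Editor's note] For context on the stack trace above: `DefaultExecutorServiceLoader` resolves executors through Java's standard `ServiceLoader` SPI mechanism, so "No ExecutorFactory found" means no executor implementation was discovered on the classpath at runtime. Below is a minimal, self-contained sketch of that lookup pattern; the `ExecutorFactory` interface here is a stand-in defined for the demo, not Flink's real class, and the behavior shown (an empty lookup because no provider is registered) is an assumption about what the job hits, not a confirmed diagnosis of this thread.

```java
import java.util.ServiceLoader;

// A minimal sketch (not Flink code) of the SPI lookup pattern used by
// DefaultExecutorServiceLoader.getExecutorFactory.
public class SpiLookupDemo {

    // Hypothetical service interface standing in for Flink's executor factory SPI.
    public interface ExecutorFactory {
        boolean isCompatibleWith(String executionTarget);
    }

    public static void main(String[] args) {
        // ServiceLoader scans META-INF/services entries on the classpath.
        // No jar in this demo registers an ExecutorFactory implementation,
        // so the loader yields nothing -- mirroring the reported error.
        ServiceLoader<ExecutorFactory> loader = ServiceLoader.load(ExecutorFactory.class);

        boolean found = false;
        for (ExecutorFactory factory : loader) {
            if (factory.isCompatibleWith("local")) {
                found = true;
            }
        }

        if (!found) {
            // This is the situation behind the exception in the thread.
            System.out.println("No ExecutorFactory found to execute the application.");
        }
    }
}
```

If the lookup really is coming up empty, a commonly reported remedy for this message on Flink 1.11 is adding `flink-clients` to the project explicitly, since the 1.11 release notes mention it is no longer pulled in transitively; verify against the official release notes before relying on this.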