Hi,
I used the official flink-1.12.5 package, configured sql-client-defaults.yaml, and ran:
bin/sql-client.sh embedded
cat conf/sql-client-defaults.yaml
catalogs:
  # A typical catalog definition looks like:
  - name: myhive
    type: hive
    hive-conf-dir: /apps/conf/hive
    default-database: default
How can I solve this?
On Nov 1, 2021, at 18:32, Jingsong Li <[email protected]> wrote:
Hi,
If you are using sql-client, you can try:
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/sqlclient/#execute-a-set-of-sql-statements
If you are using TableEnvironment, you can try a statement set too:
https://nightlies.apache.org/flink/flink-docs-master/docs/dev/table/common/#translate-and-execute-a-query
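For the TableEnvironment route, a minimal sketch (the environment setup and
the table names source_t, sink_a, sink_b are made-up placeholders; the tables
would come from your own CREATE TABLE DDL):

```scala
import org.apache.flink.table.api.{EnvironmentSettings, TableEnvironment}

val settings = EnvironmentSettings.newInstance().inStreamingMode().build()
val tableEnv = TableEnvironment.create(settings)

// Grouping both INSERTs into one StatementSet lets Flink plan and submit
// them as a single job, instead of one job per executeSql() call.
val stmtSet = tableEnv.createStatementSet()
stmtSet.addInsertSql("INSERT INTO sink_a SELECT * FROM source_t")
stmtSet.addInsertSql("INSERT INTO sink_b SELECT * FROM source_t")
stmtSet.execute()
```

In sql-client, the first link shows the equivalent: wrapping the INSERTs in a
BEGIN STATEMENT SET; ... END; block.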
Best,
Jingsong
On Fri, Oct 29, 2021 at 7:01 PM Jake <[email protected]> wrote:
>
> Hi
>
> You can do it like this:
>
> ```scala
> // Imports assume Flink 1.12; these internal parser classes live in
> // different packages in other versions.
> import org.apache.flink.sql.parser.ddl.SqlSet
> import org.apache.flink.sql.parser.dml.RichSqlInsert
> import org.apache.flink.table.planner.calcite.CalciteParser
>
> import scala.collection.mutable
>
> // SqlUtil.getSqlParserConfig is our own helper that derives the parser
> // config from the TableEnvironment's TableConfig.
> val calciteParser = new CalciteParser(SqlUtil.getSqlParserConfig(tableEnv.getConfig))
>
> val insertSqlBuffer = mutable.ArrayBuffer.empty[String]
> val statementSet = tableEnv.createStatementSet()
> val configuration = tableEnv.getConfig.getConfiguration
>
> // sqlArr holds the individual statements of the script, split beforehand
> sqlArr.foreach { item =>
>   println(item)
>   calciteParser.parse(item) match {
>     // SET key=value: apply to the session configuration
>     case sqlSet: SqlSet =>
>       configuration.setString(sqlSet.getKeyString, sqlSet.getValueString)
>     // INSERT INTO ...: buffer it so all inserts become one job
>     case _: RichSqlInsert =>
>       insertSqlBuffer += item
>     // everything else (DDL, queries, ...): execute immediately
>     case _ =>
>       tableEnv.executeSql(item).print()
>   }
> }
>
> // submit all buffered inserts together as a single job
> if (insertSqlBuffer.nonEmpty) {
>   insertSqlBuffer.foreach { item =>
>     println("insert sql: " + item)
>     statementSet.addInsertSql(item)
>   }
>   println(statementSet.explain())
>   statementSet.execute()
> }
> ```
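>
> For completeness, `sqlArr` above is just the script split into individual
> statements; a naive sketch (file name is a placeholder, and this assumes no
> semicolons inside string literals or comments):
>
> ```scala
> val script = scala.io.Source.fromFile("job.sql").mkString
> val sqlArr = script.split(";").map(_.trim).filter(_.nonEmpty)
> ```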
>
>
> On Oct 29, 2021, at 18:50, wx liao <[email protected]> wrote:
>
> Hi:
> I use Flink SQL and run a script that has one source and two sinks. I can
> see 2 jobs running through the web UI; is that normal?
> Is there a way to make sure it runs as only one job with one source and two sinks?
> Thank you
>
>
--
Best, Jingsong Lee