Re: How to make two SQLs use the same KafkaTableSource?

2019-08-09 Thread Tony Wei
Hi Zhenghua,

> Blink planner supports lazy translation for multiple SQLs, and the common
> nodes will be reused in a single job.

It is very helpful, and thanks for your clarification.

> The only thing you need to note here is that the unified TableEnvironmentImpl does
> not support conversions between Table(s) and Stream(s).

Re: How to make two SQLs use the same KafkaTableSource?

2019-08-08 Thread Zhenghua Gao
The Blink planner supports lazy translation for multiple SQLs, and the common nodes will be reused in a single job. The only thing you need to note here is that the unified TableEnvironmentImpl does not support conversions between Table(s) and Stream(s). You must use the pure SQL API (DDL/DML via sqlUpdate, DQL via sqlQuery).
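Under those constraints, a minimal sketch of the pure-SQL usage with the Blink planner (Flink 1.9-era API; the table names, schemas, and connector properties below are placeholders, not taken from the thread) might look like:

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;

public class TwoSqlsOneSource {
    public static void main(String[] args) throws Exception {
        // Blink planner in streaming mode; note the unified TableEnvironment
        // offers no toRetractStream/toAppendStream conversions.
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        TableEnvironment tEnv = TableEnvironment.create(settings);

        // DDL for the shared Kafka source and two sinks
        // (schemas and connector properties elided).
        tEnv.sqlUpdate("CREATE TABLE kafka_src (...) WITH ('connector.type' = 'kafka', ...)");
        tEnv.sqlUpdate("CREATE TABLE sink_a (...) WITH (...)");
        tEnv.sqlUpdate("CREATE TABLE sink_b (...) WITH (...)");

        // Two DML statements over the same source; translation is deferred
        // until execute(), which is what lets the planner reuse the common
        // scan node instead of creating two Kafka sources.
        tEnv.sqlUpdate("INSERT INTO sink_a SELECT ... FROM kafka_src");
        tEnv.sqlUpdate("INSERT INTO sink_b SELECT ... FROM kafka_src");
        tEnv.execute("two-sqls-one-kafka-source");
    }
}
```

The key design point is that sqlUpdate only buffers the statements; the whole job graph is planned in one pass at execute(), so common subplans can be shared.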

Re: How to make two SQLs use the same KafkaTableSource?

2019-08-08 Thread Tony Wei
Forgot to send to the user mailing list.

Tony Wei wrote on Fri, Aug 9, 2019 at 12:36 PM:

> Hi Zhenghua,
>
> I didn't get your point. It seems that `isEagerOperationTranslation` always
> returns false. Does that mean that even if I used the Blink planner, the SQL
> translation is still done in a lazy manner?
>
> Or do you mean …

How to make two SQLs use the same KafkaTableSource?

2019-08-08 Thread Tony Wei
Hi, I used `flinkTableEnv.connect(new Kafka()...).registerTableSource(...)` to register my Kafka table. However, I found that because SQL translation is lazy, a table is only converted to a DataStream under certain operations, for example `Table#toRetractStream`. So, when I used two SQLs in one application job, each SQL was translated into its own Kafka source instead of sharing one.
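For context, a sketch of the pattern described above (Flink 1.8/1.9 descriptor API; the topic, schema, and queries are illustrative, not from the thread). Each `toRetractStream` call translates its query on the spot, which is how two queries can end up with two independent Kafka source operators:

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.java.StreamTableEnvironment;
import org.apache.flink.table.descriptors.Json;
import org.apache.flink.table.descriptors.Kafka;
import org.apache.flink.table.descriptors.Schema;
import org.apache.flink.types.Row;

public class TwoSqlsTwoSources {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tEnv = StreamTableEnvironment.create(env);

        // Register one logical Kafka table via the descriptor API.
        tEnv.connect(new Kafka().version("universal").topic("events"))
            .withFormat(new Json())
            .withSchema(new Schema()
                .field("id", Types.STRING)
                .field("v", Types.LONG))
            .inAppendMode()
            .registerTableSource("kafka_src");

        Table t1 = tEnv.sqlQuery("SELECT id, COUNT(*) FROM kafka_src GROUP BY id");
        Table t2 = tEnv.sqlQuery("SELECT id, SUM(v) FROM kafka_src GROUP BY id");

        // Each conversion eagerly translates its own query plan, so the
        // physical graph may contain a separate Kafka scan per query even
        // though both refer to the same registered table.
        tEnv.toRetractStream(t1, Row.class).print();
        tEnv.toRetractStream(t2, Row.class).print();
        env.execute("two-sqls-two-kafka-sources");
    }
}
```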