[ https://issues.apache.org/jira/browse/FLINK-7151?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16083424#comment-16083424 ]

Jark Wu commented on FLINK-7151:
--------------------------------

The idea is very good. But currently Flink doesn't provide a catalog (like 
HCatalog in Hive), i.e. a store for managing tables and functions. So all 
tables and functions in Flink are job-level, not shared across jobs (which is 
what "temporary" means, I think).

However, Flink provides an {{ExternalCatalog}} API for connecting an external 
database catalog to Flink's Table API. So I think you could extend 
{{ExternalCatalog}} to also support registering external functions (i.e. 
non-temporary functions). 
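
A rough sketch of that direction is below; the function-related methods and the 
{{ExternalCatalogFunction}} type are assumptions for illustration, not existing 
Flink API:

{code}
import java.util.{List => JList}
import org.apache.flink.table.catalog.ExternalCatalog

// Hypothetical metadata for a persisted (non-temporary) function: the
// registered name plus the implementing class, matching the CREATE TEMPORARY
// FUNCTION example in the issue description.
case class ExternalCatalogFunction(name: String, className: String)

// Hypothetical extension of ExternalCatalog that also exposes functions, so a
// TableEnvironment could resolve e.g. TOPK from external storage rather than
// from the job-local function registry.
trait FunctionAwareExternalCatalog extends ExternalCatalog {
  // Look up one persisted function by name.
  def getFunction(functionName: String): ExternalCatalogFunction

  // List the names of all functions stored in this catalog.
  def listFunctions(): JList[String]
}
{code}

A catalog like this, once registered with the TableEnvironment, could then 
serve both table and function lookups.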

> FLINK SQL support create temporary function and table
> -----------------------------------------------------
>
>                 Key: FLINK-7151
>                 URL: https://issues.apache.org/jira/browse/FLINK-7151
>             Project: Flink
>          Issue Type: New Feature
>          Components: Table API & SQL
>            Reporter: yuemeng
>
> Based on CREATE TEMPORARY FUNCTION and CREATE TEMPORARY TABLE, we can 
> register a UDF, UDAF, or UDTF with SQL:
> {code}
> CREATE TEMPORARY FUNCTION 'TOPK' AS 'com.xxxx.aggregate.udaf.distinctUdaf.topk.ITopKUDAF';
> INSERT INTO db_sink SELECT id, TOPK(price, 5, 'DESC') FROM kafka_source GROUP BY id;
> {code}



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)
