[
https://issues.apache.org/jira/browse/FLINK-34992?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17943418#comment-17943418
]
Hao Li commented on FLINK-34992:
--------------------------------
Hi [~fsk119],
These are good questions, and I'm just starting to look into them.
# Yes. We need a ModelFactory, ModelSource and ModelRuntimeProvider, like we
have for tables, so that different runtime implementations can be plugged in.
As for the detailed implementation, as we discussed offline, it's still open.
I think the main hurdle of using a model in a function is how to resolve it in
the parser and what its type is during evaluation. I'm still brainstorming on
that; see the first sketch after this list.
# What I have in mind is `connector`, which is used to create the ModelSource
and hence the RuntimeProvider. `task` is needed because a remote model is just
an endpoint we can call, so the endpoint alone doesn't tell you the model type.
Even for a local model, it's better to specify the task explicitly since that
is convenient for evaluation. These are my current thoughts. For remote models,
we also need a `provider` config plus the necessary endpoint, auth method, etc.
(see the second sketch after this list).
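For (1), here is a rough sketch of how the layering could mirror the existing
table stack (factory -> source -> runtime provider). All names and signatures
below are hypothetical and only illustrate the shape, nothing FLIP-437 has
fixed yet:
{code:java}
// Hypothetical sketch only: the names mirror the DynamicTableFactory /
// DynamicTableSource / RuntimeProvider layering, they are not FLIP-437 APIs.
import java.io.Serializable;
import java.util.Map;

/** Discovered via the `connector` option, analogous to DynamicTableFactory. */
interface ModelFactory {
    /** Must match the value of the `connector` option. */
    String factoryIdentifier();

    /** Creates the planner-facing model source from the model's options. */
    ModelSource createModelSource(Map<String, String> options);
}

/** Planner-facing handle to a resolved model, analogous to DynamicTableSource. */
interface ModelSource {
    ModelRuntimeProvider getRuntimeProvider();
}

/** Supplies the runtime implementation that is invoked during evaluation. */
interface ModelRuntimeProvider {
    PredictFunction createPredictFunction();

    /** Per-row evaluation, e.g. a call to a remote endpoint for remote models. */
    interface PredictFunction extends Serializable {
        Object[] eval(Object... args);
    }
}
{code}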
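For (2), a minimal sketch of how those options could be declared on the factory
side with Flink's ConfigOptions. `connector`, `task` and `provider` are the
keys mentioned above; the endpoint and auth keys are placeholders, not agreed
names:
{code:java}
// Illustrative only: option keys other than `connector`, `task` and `provider`
// are assumptions, not names fixed by FLIP-437.
import org.apache.flink.configuration.ConfigOption;
import org.apache.flink.configuration.ConfigOptions;

final class ModelOptions {

    /** Selects the ModelFactory, mirroring how `connector` selects a table factory. */
    static final ConfigOption<String> CONNECTOR =
            ConfigOptions.key("connector").stringType().noDefaultValue()
                    .withDescription("Identifier of the model factory to use.");

    /**
     * What the model does (e.g. classification, embedding); a remote endpoint
     * alone does not reveal this, and it is convenient for evaluation.
     */
    static final ConfigOption<String> TASK =
            ConfigOptions.key("task").stringType().noDefaultValue()
                    .withDescription("Task the model performs.");

    /** Remote-model specifics: provider plus its endpoint and auth method. */
    static final ConfigOption<String> PROVIDER =
            ConfigOptions.key("provider").stringType().noDefaultValue();

    // Placeholder keys for remote endpoints; the real names are still open.
    static final ConfigOption<String> ENDPOINT =
            ConfigOptions.key("endpoint").stringType().noDefaultValue();
    static final ConfigOption<String> AUTH_METHOD =
            ConfigOptions.key("auth.method").stringType().noDefaultValue();

    private ModelOptions() {}
}
{code}
The idea would be that `connector` selects the ModelFactory (similar to table
factory discovery) and the remaining options are validated by the factory it
selects.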
Hope this helps.
> FLIP-437: Support ML Models in Flink SQL
> ----------------------------------------
>
> Key: FLINK-34992
> URL: https://issues.apache.org/jira/browse/FLINK-34992
> Project: Flink
> Issue Type: New Feature
> Components: Table SQL / API, Table SQL / Planner, Table SQL / Runtime
> Reporter: Hao Li
> Priority: Major
>
> This is an umbrella task for FLIP-437. FLIP-437:
> https://cwiki.apache.org/confluence/display/FLINK/FLIP-437%3A+Support+ML+Models+in+Flink+SQL