Hi, could you check whether the hive sql connector dependency [1] has been placed under the lib directory, or added with ADD JAR?




[1] 
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/hive/overview/
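For example, you could verify/load it from the SQL client like this (a minimal sketch; the jar name and version below are only placeholders and depend on your Flink and Hive versions):

    -- list the jars currently registered in this SQL client session
    SHOW JARS;

    -- if the hive connector jar is missing, add it for the current session,
    -- or copy it into $FLINK_HOME/lib and restart the SQL client / cluster
    ADD JAR '/path/to/flink-sql-connector-hive-3.1.3_2.12-1.19.0.jar';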




--

    Best!
    Xuyang





At 2024-07-15 17:09:45, "冯奇" <ha.fen...@aisino.com> wrote:
>Flink SQL> USE CATALOG myhive;
>Flink SQL> CREATE TABLE mykafka (name String, age Int) WITH (
> 'connector.type' = 'kafka',
> 'connector.version' = 'universal',
> 'connector.topic' = 'hive_sink',
> 'connector.properties.bootstrap.servers' = '10.0.15.242:9092',
> 'format.type' = 'csv',
> 'update-mode' = 'append'
>);
>The following error is reported:
>[ERROR] Could not execute SQL statement. Reason:
>org.apache.flink.table.factories.NoMatchingTableFactoryException: Could not 
>find a suitable table factory for 
>'org.apache.flink.table.factories.TableSourceFactory' in
>the classpath.
>Reason: Required context properties mismatch.
>The matching candidates:
>org.apache.flink.table.sources.CsvAppendTableSourceFactory
>Mismatched properties:
>'connector.type' expects 'filesystem', but is 'kafka'
>The following properties are requested:
>connector.properties.bootstrap.servers=10.0.15.242:9092
>connector.topic=hive_sink
>connector.type=kafka
>connector.version=universal
>format.type=csv
>schema.0.data-type=VARCHAR(2147483647)
>schema.0.name=name
>schema.1.data-type=INT
>schema.1.name=age
>update-mode=append
>The following factories have been considered:
>org.apache.flink.table.sources.CsvBatchTableSourceFactory
>org.apache.flink.table.sources.CsvAppendTableSourceFactory
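
(Side note: the DDL above uses the legacy 'connector.type' property set, which is resolved through the old TableFactory lookup that produced this error. A minimal sketch of the same table using the current 'connector' option style, assuming the flink-sql-connector-kafka jar is also on the classpath, would be:)

    CREATE TABLE mykafka (name STRING, age INT) WITH (
      'connector' = 'kafka',
      'topic' = 'hive_sink',
      'properties.bootstrap.servers' = '10.0.15.242:9092',
      'format' = 'csv',
      'scan.startup.mode' = 'earliest-offset'
    );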
