You need to place flink-sql-connector-hive-3.1.3 under the lib directory; that is the jar used by SQL jobs.
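As a side note (my reading of the stack trace quoted below, not something stated in this thread): the DDL in the error report uses the legacy `connector.type`-style property keys, whose table factories are no longer shipped in recent Flink versions. With Flink 1.19 and the kafka connector jar already in lib, the same table would be declared with the unified connector options, roughly:

```sql
-- A minimal sketch using the unified connector options (Flink 1.12+ style);
-- assumes the flink-connector-kafka jar from lib provides the 'kafka' factory.
CREATE TABLE mykafka (
  name STRING,
  age  INT
) WITH (
  'connector' = 'kafka',                              -- replaces 'connector.type'
  'topic' = 'hive_sink',                              -- replaces 'connector.topic'
  'properties.bootstrap.servers' = '10.0.15.242:9092',
  'format' = 'csv'                                    -- replaces 'format.type'
);
```

With the unified options the append/upsert behavior is derived from the connector and format, so the old `update-mode` key is no longer needed.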




--

    Best!
    Xuyang





On 2024-07-16 13:40:23, "冯奇" <ha.fen...@aisino.com> wrote:
>I checked the documentation; those jars are all there. There is also a separately downloaded dependency, flink-sql-connector-hive-3.1.3 — I'm not sure whether to use that one or the ones below?
>// Flink's Hive connector
>flink-connector-hive_2.12-1.19.1.jar
>// Hive dependencies
>hive-exec-3.1.0.jar
>libfb303-0.9.3.jar // libfb303 is not packed into hive-exec in some versions, need to add it separately
>// add antlr-runtime if you need to use hive dialect
>antlr-runtime-3.5.2.jar
>Jars currently under lib:
>antlr-runtime-3.5.2.jar
>flink-cdc-dist-3.0.0.jar
>flink-cdc-pipeline-connector-doris-3.1.0.jar
>flink-cdc-pipeline-connector-mysql-3.1.0.jar
>flink-cep-1.19.0.jar
>flink-connector-files-1.19.0.jar
>flink-connector-hive_2.12-1.19.0.jar
>flink-connector-jdbc-3.1.2-1.18.jar
>flink-connector-kafka-3.1.0-1.18.jar
>flink-csv-1.19.0.jar
>flink-dist-1.19.0.jar
>flink-json-1.19.0.jar
>flink-scala_2.12-1.19.0.jar
>flink-table-api-java-1.19.0.jar
>flink-table-api-java-uber-1.19.0.jar
>flink-table-common-1.19.0.jar
>flink-table-planner-loader-1.19.0.jar
>flink-table-runtime-1.19.0.jar
>hive-exec-3.1.2.jar
>libfb303-0.9.3.jar
>log4j-1.2-api-2.17.1.jar
>log4j-api-2.17.1.jar
>log4j-core-2.17.1.jar
>log4j-slf4j-impl-2.17.1.jar
>mysql-connector-java-8.0.28.jar
>paimon-flink-1.19-0.9-20240628.002224-23.jar
>------------------------------------------------------------------
>From: Xuyang <xyzhong...@163.com>
>Sent: Tuesday, 2024-07-16 11:43
>To: "user-zh" <user-zh@flink.apache.org>
>Subject: Re: Question about using the Hive catalog
>Hi, could you check whether the hive sql connector dependency [1] has been placed in the lib directory or added via ADD JAR?
>[1] 
>https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/hive/overview/
>--
> Best!
> Xuyang
>At 2024-07-15 17:09:45, "冯奇" <ha.fen...@aisino.com> wrote:
>>Flink SQL> USE CATALOG myhive;
>>Flink SQL> CREATE TABLE mykafka (name String, age Int) WITH (
>> 'connector.type' = 'kafka',
>> 'connector.version' = 'universal',
>> 'connector.topic' = 'hive_sink',
>> 'connector.properties.bootstrap.servers' = '10.0.15.242:9092',
>> 'format.type' = 'csv',
>> 'update-mode' = 'append'
>>);
>>It reports the following error:
>>[ERROR] Could not execute SQL statement. Reason:
>>org.apache.flink.table.factories.NoMatchingTableFactoryException: Could not 
>>find a suitable table factory for 
>>'org.apache.flink.table.factories.TableSourceFactory' in
>>the classpath.
>>Reason: Required context properties mismatch.
>>The matching candidates:
>>org.apache.flink.table.sources.CsvAppendTableSourceFactory
>>Mismatched properties:
>>'connector.type' expects 'filesystem', but is 'kafka'
>>The following properties are requested:
>>connector.properties.bootstrap.servers=10.0.15.242:9092
>>connector.topic=hive_sink
>>connector.type=kafka
>>connector.version=universal
>>format.type=csv
>>schema.0.data-type=VARCHAR(2147483647)
>>schema.0.name=name
>>schema.1.data-type=INT
>>schema.1.name=age
>>update-mode=append
>>The following factories have been considered:
>>org.apache.flink.table.sources.CsvBatchTableSourceFactory
>>org.apache.flink.table.sources.CsvAppendTableSourceFactory
