https://ci.apache.org/projects/flink/flink-docs-release-1.11/dev/table/hive/
Did you download flink-sql-connector-hive-3.1.2 (linked from the page above) and put it into Flink's lib/ directory?
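For reference, placing the bundled connector could look roughly like the sketch below. The install path is a placeholder, and the Maven coordinates assume Flink 1.11.1 with Scala 2.11 (HDP 3.0.1 ships Hive 3.1.0, for which the 3.1.2 bundle is the documented match):

```shell
# Sketch only: the install path and exact artifact version are assumptions,
# adjust them to your setup.
FLINK_HOME=/opt/flink   # hypothetical install location

# Bundled Hive connector matching Flink 1.11.1 / Scala 2.11:
wget https://repo.maven.apache.org/maven2/org/apache/flink/flink-sql-connector-hive-3.1.2_2.11/1.11.1/flink-sql-connector-hive-3.1.2_2.11-1.11.1.jar

# The SQL client only scans lib/ at startup, so restart it afterwards.
mv flink-sql-connector-hive-3.1.2_2.11-1.11.1.jar "$FLINK_HOME/lib/"
```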
On 2020/8/24 3:01, 黄蓉 wrote:
Hi all,
My environment is the HDP 3.0.1 sandbox, and Flink is the latest version, 1.11.1, using the pre-built binaries downloaded directly from the official site. I want to test the Flink/Hive integration, including querying Hive table data and writing data into Hive tables. The problem I'm running into is that queries through the Flink SQL client return no table data, and no error is reported either, even though the same table does return records when queried in Hive. Other statements such as SHOW TABLES and SHOW DATABASES display correctly.
The configured Hadoop environment variables are as follows:
export HADOOP_CONF_DIR="/etc/hadoop/conf"
export HADOOP_HOME="/usr/hdp/3.0.1.0-187/hadoop"
export HADOOP_CLASSPATH="/usr/hdp/current/hadoop-client/*:/usr/hdp/current/hadoop-client/lib/*:/usr/hdp/current/hadoop-hdfs-client/*:/usr/hdp/current/hadoop-hdfs-client/lib/*:/usr/hdp/current/hadoop-yarn-client/*:/usr/hdp/current/hadoop-yarn-client/lib/*:/usr/hdp/current/hadoop-mapreduce-client/*:/usr/hdp/current/hadoop-mapreduce-client/lib/*"
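As an aside, instead of hand-maintaining that list of jar directories, the Flink docs suggest letting the hadoop CLI assemble the client classpath itself, along these lines:

```shell
# Let the (HDP-provided) hadoop CLI compute the full client classpath,
# rather than enumerating the jar directories by hand.
export HADOOP_CONF_DIR="/etc/hadoop/conf"
export HADOOP_CLASSPATH=$(hadoop classpath)
```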
The sql-client configuration file is as follows:
tables: []
functions: []

catalogs:
  - name: myhive
    type: hive
    hive-conf-dir: /opt/hive-conf

execution:
  planner: blink
  type: batch
  result-mode: table
  max-table-result-rows: 1000000
  parallelism: 3
  max-parallelism: 128
  min-idle-state-retention: 0
  max-idle-state-retention: 0
  current-catalog: myhive
  current-database: default
  restart-strategy:
    type: fallback

deployment:
  response-timeout: 5000
  gateway-address: ""
  gateway-port: 0
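With that environment file in place, a session to reproduce the symptom would look something like the following (the environment-file path is the 1.11 default, and the table name is a placeholder):

```shell
# Start the SQL client in embedded mode with the environment file above.
./bin/sql-client.sh embedded -e conf/sql-client-defaults.yaml

# Then, inside the client:
#   SHOW TABLES;                      -- lists the Hive tables correctly
#   SELECT * FROM some_hive_table;    -- placeholder name; returns no rows
```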
Could this behavior mean that the official Flink binaries are incompatible with HDP 3.0.1? Do I need to rebuild Flink myself?
Jessie
jessie...@gmail.com