My English isn't good enough for that, I'm afraid....


18579099...@163.com
 
From: yuxia
Sent: 2022-05-11 15:11
To: user-zh
Subject: Re: Flink SQL cannot read Hive-mapped HBase tables
Sorry, I tried to reproduce your problem, but I don't have an HBase environment. It looks like it only goes wrong when the table is declared with STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'?
I'll debug it when I get some free time.
 
I did take a look at the Flink code for this path, though. From what I can see, after Flink fetches this Hive table, the inputFormat of its StorageDescriptor is null, and Class.forName(inputFormat) then throws the NPE.
That piece of code seems to be the problem.
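If you want to double-check on your side, a quick sketch (plain Hive, run in beeline or the hive CLI; the table name is taken from your mail):

-- For a table created with STORED BY '...HBaseStorageHandler', the
-- "InputFormat:" line in the output is typically null, which would
-- match the Class.forName(null) NPE in your stack trace.
DESCRIBE FORMATTED ods.student;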
If it's convenient for you, could you help by filing a JIRA?
https://issues.apache.org/jira/projects/FLINK/summary
 
 
 
Best regards,
Yuxia
 
----- Original Message -----
From: 18579099...@163.com
To: "user-zh" <user-zh@flink.apache.org>
Sent: Tuesday, May 10, 2022 10:39:16 AM
Subject: Re: Re: Flink SQL cannot read Hive-mapped HBase tables
 
Versions:
flink: 1.13.6
hive: 2.1.1-cdh6.2.0
hbase: 2.1.0-cdh6.2.0
Flink SQL execution tool: Flink SQL Client
SQL submission mode: yarn-per-job
------------------------------------------------------------------------
Jars in the Flink lib directory:
antlr-runtime-3.5.2.jar
flink-csv-1.13.6.jar
flink-dist_2.11-1.13.6.jar
flink-json-1.13.6.jar
flink-shaded-zookeeper-3.4.14.jar
flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar
flink-table_2.11-1.13.6.jar
flink-table-blink_2.11-1.13.6.jar
guava-14.0.1.jar
hadoop-mapreduce-client-core-3.0.0-cdh6.2.0.jar
hbase-client-2.1.0-cdh6.2.0.jar
hbase-common-2.1.0-cdh6.2.0.jar
hbase-protocol-2.1.0-cdh6.2.0.jar
hbase-server-2.1.0-cdh6.2.0.jar
hive-exec-2.1.1-cdh6.2.0.jar
hive-hbase-handler-2.1.1-cdh6.2.0.jar
htrace-core4-4.1.0-incubating.jar
log4j-1.2-api-2.17.1.jar
log4j-api-2.17.1.jar
log4j-core-2.17.1.jar
log4j-slf4j-impl-2.17.1.jar
protobuf-java-2.5.0.jar
------------------------------------------------------------------------
Hive DDL:
CREATE EXTERNAL TABLE `ods.student`(
  `row_key` string, 
  `name` string,
  `age` int,
  `addr` string 
) 
ROW FORMAT SERDE 
  'org.apache.hadoop.hive.hbase.HBaseSerDe' 
STORED BY 
  'org.apache.hadoop.hive.hbase.HBaseStorageHandler' 
WITH SERDEPROPERTIES (
  'hbase.columns.mapping'=':key,FINAL:NAME,FINAL:AGE,FINAL:ADDR',
  'serialization.format'='1')
TBLPROPERTIES (
  'hbase.table.name'='ODS:STUDENT') ;
Catalog: Hive catalog
SQL: select * from ods.student;
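For completeness, the sql-client setup looks roughly like this (the catalog name and hive-conf-dir below are placeholders, not my exact values):

-- Register the Hive catalog in sql-client, then query through it.
CREATE CATALOG myhive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/etc/hive/conf'
);
USE CATALOG myhive;
SELECT * FROM ods.student;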
After reading the error messages I added some jars under flink lib; the earlier errors were caused by missing dependencies. Now I get a new error:
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.connectors.hive.FlinkHiveException: Unable to instantiate the hadoop input format
------------------------------------------------------------------------
Full stack trace:
org.apache.flink.table.client.gateway.SqlExecutionException: Could not execute SQL statement.
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:215) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:235) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:479) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:412) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.cli.CliClient.lambda$executeStatement$0(CliClient.java:327) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at java.util.Optional.ifPresent(Optional.java:159) ~[?:1.8.0_191]
    at org.apache.flink.table.client.cli.CliClient.executeStatement(CliClient.java:327) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:297) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:221) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.SqlClient.startClient(SqlClient.java:187) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.SqlClient.main(SqlClient.java:161) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
Caused by: org.apache.flink.connectors.hive.FlinkHiveException: Unable to instantiate the hadoop input format
    at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createMRSplits(HiveSourceFileEnumerator.java:100) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createInputSplits(HiveSourceFileEnumerator.java:71) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource.lambda$getDataStream$1(HiveTableSource.java:212) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveParallelismInference.logRunningTime(HiveParallelismInference.java:107) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveParallelismInference.infer(HiveParallelismInference.java:95) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource.getDataStream(HiveTableSource.java:207) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource$1.produceDataStream(HiveTableSource.java:123) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecTableSourceScan.translateToPlanInternal(CommonExecTableSourceScan.java:96) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecEdge.translateToPlan(ExecEdge.java:247) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:114) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$1.apply(StreamPlanner.scala:70) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$1.apply(StreamPlanner.scala:69) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeQueryOperation(TableEnvironmentImpl.java:791) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1225) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    ... 12 more
Caused by: java.lang.NullPointerException
    at java.lang.Class.forName0(Native Method) ~[?:1.8.0_191]
    at java.lang.Class.forName(Class.java:348) ~[?:1.8.0_191]
    at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createMRSplits(HiveSourceFileEnumerator.java:94) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveSourceFileEnumerator.createInputSplits(HiveSourceFileEnumerator.java:71) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource.lambda$getDataStream$1(HiveTableSource.java:212) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveParallelismInference.logRunningTime(HiveParallelismInference.java:107) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveParallelismInference.infer(HiveParallelismInference.java:95) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource.getDataStream(HiveTableSource.java:207) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.connectors.hive.HiveTableSource$1.produceDataStream(HiveTableSource.java:123) ~[flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.common.CommonExecTableSourceScan.translateToPlanInternal(CommonExecTableSourceScan.java:96) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecEdge.translateToPlan(ExecEdge.java:247) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.stream.StreamExecSink.translateToPlanInternal(StreamExecSink.java:114) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.plan.nodes.exec.ExecNodeBase.translateToPlan(ExecNodeBase.java:134) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$1.apply(StreamPlanner.scala:70) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner$$anonfun$1.apply(StreamPlanner.scala:69) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.Iterator$class.foreach(Iterator.scala:891) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractIterator.foreach(Iterator.scala:1334) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.IterableLike$class.foreach(IterableLike.scala:72) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractIterable.foreach(Iterable.scala:54) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.TraversableLike$class.map(TraversableLike.scala:234) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at scala.collection.AbstractTraversable.map(Traversable.scala:104) ~[flink-dist_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.StreamPlanner.translateToPlan(StreamPlanner.scala:69) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.planner.delegation.PlannerBase.translate(PlannerBase.scala:165) ~[flink-table-blink_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.translate(TableEnvironmentImpl.java:1518) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeQueryOperation(TableEnvironmentImpl.java:791) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.api.internal.TableEnvironmentImpl.executeInternal(TableEnvironmentImpl.java:1225) ~[flink-table_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.lambda$executeOperation$3(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.context.ExecutionContext.wrapClassLoader(ExecutionContext.java:90) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:213) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
    ... 12 more
 
 
 
 
 
 
 
18579099...@163.com
From: yuxia
Sent: 2022-05-10 09:32
To: user-zh
Subject: Re: Flink SQL cannot read Hive-mapped HBase tables
Are you using the Hive Catalog? Which versions of the Hive connector and Hive are you on?
Also, please paste the full stack trace.
Best regards,
Yuxia
----- Original Message -----
From: 18579099...@163.com
To: "user-zh" <user-zh@flink.apache.org>
Sent: Monday, May 9, 2022 5:46:02 PM
Subject: Flink SQL cannot read Hive-mapped HBase tables
Some of my table data lives in HBase, and I normally read it through Hive external tables mapped onto HBase. I want to read that data with Flink SQL by querying the Hive tables (the reason I don't read HBase with Flink directly is that I use the Hive catalog, so I don't have to write the table DDL again before querying). When I tried this in sql-client it failed.
Normal Hive tables read fine, but querying a Hive-on-HBase table fails with:
[ERROR] Could not execute SQL statement. Reason:
org.apache.flink.table.catalog.exceptions.CatalogException: Failed to get table schema from deserializer.
Is there any way to solve this? Reading the same data with the Spark engine works fine.
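(The fallback I'm trying to avoid is re-declaring the table with Flink's HBase connector, roughly like this; it would need flink-sql-connector-hbase-2.2 in lib, and the table name 'student_hbase' and 'zk-host:2181' are placeholders:)

-- Direct Flink DDL for the same HBase table, mirroring the Hive mapping above:
-- the FINAL column family becomes a ROW of its qualifiers.
CREATE TABLE student_hbase (
  row_key STRING,
  FINAL ROW<NAME STRING, AGE INT, ADDR STRING>,
  PRIMARY KEY (row_key) NOT ENFORCED
) WITH (
  'connector' = 'hbase-2.2',
  'table-name' = 'ODS:STUDENT',
  'zookeeper.quorum' = 'zk-host:2181'
);
SELECT * FROM student_hbase;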
18579099...@163.com
