Re: Re: Flink SQL cannot read an HBase table mapped via Hive

2022-05-18 Post by Jingsong Li
Thanks~ Very nice~

Best,
Jingsong

On Mon, May 16, 2022 at 5:24 PM 18579099...@163.com <18579099...@163.com>
wrote:

> This is my first time filing one, so I'm not sure whether I wrote it correctly.
>
> https://issues.apache.org/jira/projects/FLINK/issues/FLINK-27604
>
>
>
> 18579099...@163.com
>
> From: Jingsong Li
> Sent: 2022-05-13 15:06
> To: user-zh
> Subject: Re: Re: Flink SQL cannot read an HBase table mapped via Hive
> Hi, I recommend https://www.deepl.com/translator
> It works really well.
>
> I recall that the Hive custom storage handler (HBase) case does have a problem here.
>
> Best,
> Jingsong
>
> On Fri, May 13, 2022 at 2:12 PM 18579099...@163.com <18579099...@163.com>
> wrote:
>
> > My English skills aren't up to that.
> >
> >
> >
> > 18579099...@163.com
> >
> > From: yuxia
> > Sent: 2022-05-11 15:11
> > To: user-zh
> > Subject: Re: Flink SQL cannot read an HBase table mapped via Hive
> > Sorry, I tried to reproduce your problem, but I don't have an HBase environment.
> > It looks like the issue only appears when the table is created with STORED BY
> >   'org.apache.hadoop.hive.hbase.HBaseStorageHandler'?
> > I'll debug it later when I have some free time.
> >
> > That said, I took a look at the relevant Flink code. From what I can see, after
> > Flink fetches this Hive table, the inputformat of its StorageDescriptor is null,
> > and Class.forName(inputformat) then throws an NPE.
> > That piece of code seems to be at fault.
> > If it's convenient for you, could you please help file a JIRA?
> > https://issues.apache.org/jira/projects/FLINK/summary
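A minimal sketch of the failure mode described above, assuming the table object
comes from the Hive metastore API (org.apache.hadoop.hive.metastore.api.Table);
it is illustrative only and not the actual Flink connector source. The helper
name resolveInputFormat is hypothetical.

    import org.apache.hadoop.hive.metastore.api.StorageDescriptor;
    import org.apache.hadoop.hive.metastore.api.Table;

    public class InputFormatLookup {

        // As described in this thread, a table created with STORED BY (a custom
        // storage handler such as HBaseStorageHandler) may carry no MapReduce
        // input format in its StorageDescriptor, so passing the null name
        // straight to Class.forName throws a NullPointerException, which the
        // SQL client then reports as "Unable to instantiate the hadoop input format".
        static Class<?> resolveInputFormat(Table hiveTable) throws ClassNotFoundException {
            StorageDescriptor sd = hiveTable.getSd();
            String inputFormatName = sd.getInputFormat(); // null for storage-handler tables
            if (inputFormatName == null) {
                // A defensive check like this would turn the NPE into a readable error.
                throw new IllegalStateException(
                        "Table " + hiveTable.getTableName()
                                + " uses a custom storage handler; no input format is recorded.");
            }
            return Class.forName(inputFormatName);
        }
    }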
> >
> >
> >
> > Best regards,
> > Yuxia
> >
> > ----- Original Message -----
> > From: 18579099...@163.com
> > To: "user-zh" 
> > Sent: Tuesday, May 10, 2022, 10:39:16 AM
> > Subject: Re: Re: Flink SQL cannot read an HBase table mapped via Hive
> >
> > Versions:
> > flink: 1.13.6
> > hive: 2.1.1-cdh6.2.0
> > hbase: 2.1.0-cdh6.2.0
> > Flink SQL execution tool: Flink SQL Client
> > SQL submission mode: yarn-per-job
> >
> >
> -
> > Jars in the flink lib directory:
> > antlr-runtime-3.5.2.jar
> > flink-csv-1.13.6.jar
> > flink-dist_2.11-1.13.6.jar
> > flink-json-1.13.6.jar
> > flink-shaded-zookeeper-3.4.14.jar
> > flink-sql-connector-hive-2.2.0_2.11-1.13.6.jar
> > flink-table_2.11-1.13.6.jar
> > flink-table-blink_2.11-1.13.6.jar
> > guava-14.0.1.jar
> > hadoop-mapreduce-client-core-3.0.0-cdh6.2.0.jar
> > hbase-client-2.1.0-cdh6.2.0.jar
> > hbase-common-2.1.0-cdh6.2.0.jar
> > hbase-protocol-2.1.0-cdh6.2.0.jar
> > hbase-server-2.1.0-cdh6.2.0.jar
> > hive-exec-2.1.1-cdh6.2.0.jar
> > hive-hbase-handler-2.1.1-cdh6.2.0.jar
> > htrace-core4-4.1.0-incubating.jar
> > log4j-1.2-api-2.17.1.jar
> > log4j-api-2.17.1.jar
> > log4j-core-2.17.1.jar
> > log4j-slf4j-impl-2.17.1.jar
> > protobuf-java-2.5.0.jar
> >
> >
> 
> > Hive table DDL:
> > CREATE EXTERNAL TABLE `ods.student`(
> >   `row_key` string,
> >   `name` string,
> >   `age` int,
> >   `addr` string
> > )
> > ROW FORMAT SERDE
> >   'org.apache.hadoop.hive.hbase.HBaseSerDe'
> > STORED BY
> >   'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
> > WITH SERDEPROPERTIES (
> >   'hbase.columns.mapping'=':key,FINAL:NAME,FINAL:AGE,FINAL:ADDR',
> >   'serialization.format'='1')
> > TBLPROPERTIES (
> >   'hbase.table.name'='ODS:STUDENT') ;
> > Catalog: Hive catalog
> > SQL: select * from ods.student;
> > After reading the error message I added some jars to the flink lib directory; the
> > earlier errors were caused by missing dependencies. Now a new error shows up.
> > [ERROR] Could not execute SQL statement. Reason:
> > org.apache.flink.connectors.hive.FlinkHiveException: Unable to instantiate the hadoop input format
> >
> >
> --
> > Full stack trace:
> > org.apache.flink.table.client.gateway.SqlExecutionException: Could not execute SQL statement.
> >     at org.apache.flink.table.client.gateway.local.LocalExecutor.executeOperation(LocalExecutor.java:215) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.gateway.local.LocalExecutor.executeQuery(LocalExecutor.java:235) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.cli.CliClient.callSelect(CliClient.java:479) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.cli.CliClient.callOperation(CliClient.java:412) ~[flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.cli.CliClient.lambda$executeStatement$0(CliClient.java:327) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at java.util.Optional.ifPresent(Optional.java:159) ~[?:1.8.0_191]
> >     at org.apache.flink.table.client.cli.CliClient.executeStatement(CliClient.java:327) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.cli.CliClient.executeInteractive(CliClient.java:297) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.cli.CliClient.executeInInteractiveMode(CliClient.java:221) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.SqlClient.openCli(SqlClient.java:151) [flink-sql-client_2.11-1.13.6.jar:1.13.6]
> >     at org.apache.flink.table.client.SqlClient.start(SqlClient.java:95)
> >
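For anyone who wants to confirm this diagnosis against their own metastore, below is
a small sketch using the standard Hive metastore client API. The database and table
names match the DDL above; the HiveConf setup assumes a valid hive-site.xml is on the
classpath, and the class name CheckStorageHandlerTable is made up for illustration.

    import org.apache.hadoop.hive.conf.HiveConf;
    import org.apache.hadoop.hive.metastore.HiveMetaStoreClient;
    import org.apache.hadoop.hive.metastore.api.Table;

    public class CheckStorageHandlerTable {
        public static void main(String[] args) throws Exception {
            // Assumes hive-site.xml is on the classpath so the client can reach the metastore.
            HiveConf conf = new HiveConf();
            HiveMetaStoreClient client = new HiveMetaStoreClient(conf);
            try {
                Table t = client.getTable("ods", "student");
                // For a STORED BY table the handler class sits in the table parameters,
                // while the StorageDescriptor's input format is expected to be null.
                System.out.println("storage_handler = " + t.getParameters().get("storage_handler"));
                System.out.println("sd.inputFormat  = " + t.getSd().getInputFormat());
            } finally {
                client.close();
            }
        }
    }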

Re: [ANNOUNCE] Call for Presentations for ApacheCon Asia 2022 streaming track

2022-05-18 Post by hamster
Unsubscribe
At 2022-05-18 19:47:47, "Yu Li"  wrote:
>Hi everyone,
>
>ApacheCon Asia [1] will feature the Streaming track for the second year.
>Please don't hesitate to submit your proposal if there is an interesting
>project or Flink experience you would like to share with us!
>
>The conference will be online (virtual) and the talks will be pre-recorded.
>The deadline of proposal submission is at the end of this month (May 31st).
>
>See you all there :)
>
>Best Regards,
>Yu
>
>[1] https://apachecon.com/acasia2022/cfp.html


[ANNOUNCE] Call for Presentations for ApacheCon Asia 2022 streaming track

2022-05-18 Post by Yu Li
Hi everyone,

ApacheCon Asia [1] will feature the Streaming track for the second year.
Please don't hesitate to submit your proposal if there is an interesting
project or Flink experience you would like to share with us!

The conference will be online (virtual) and the talks will be pre-recorded.
The deadline of proposal submission is at the end of this month (May 31st).

See you all there :)

Best Regards,
Yu

[1] https://apachecon.com/acasia2022/cfp.html


Re: Unsubscribe

2022-05-18 Post by Zhuoluo Yang
Zhixin is right.

Thanks,
Zhuoluo


Jingsong Li  wrote on Mon, May 16, 2022 at 09:58:

> To unsubscribe, please send a reply to user-zh-unsubscr...@flink.apache.org
>
> Best,
> Jingsong
>
> On Sun, May 15, 2022 at 1:04 PM cq <17691150...@163.com> wrote:
>
> > Unsubscribe
> >
> >
> >
> > Best Regards,
> >
> > Jacob.Q.Cao
> >
> >
> > TEL:17691150986
>


Re: Unsubscribe

2022-05-18 Post by Zhuoluo Yang
How many more times does Zhixin have to tell you? To unsubscribe, send an email to user-zh-unsubscr...@flink.apache.org

Thanks,
Zhuoluo


孙洪龙  wrote on Wed, May 18, 2022 at 10:39:

> Unsubscribe
>