Thanks a lot Bowen.
I've started reading these docs. They are really helpful and give a good
description of the Hive integration in Flink and how to use it.
I'll continue my development.
See you soon
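
For reference, here is a minimal sketch of what I plan to try for the metadata
part, based on the catalog docs above: register a HiveCatalog with the Table API
and browse the metastore through it. The catalog name, database, Hive version,
hive-site.xml path and table name below are placeholders for my own setup, not
something taken from the docs:

import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.TableEnvironment;
import org.apache.flink.table.catalog.ObjectPath;
import org.apache.flink.table.catalog.hive.HiveCatalog;

public class HiveCatalogSketch {
    public static void main(String[] args) throws Exception {
        // Blink planner in streaming mode, to match the streaming job
        EnvironmentSettings settings = EnvironmentSettings.newInstance()
                .useBlinkPlanner()
                .inStreamingMode()
                .build();
        TableEnvironment tableEnv = TableEnvironment.create(settings);

        // Placeholders: catalog name, default database, directory containing
        // hive-site.xml, and the Hive version of my metastore
        HiveCatalog hive = new HiveCatalog(
                "myhive", "default", "/etc/hive/conf", "2.3.4");

        // Register the catalog and make it the current one so Flink SQL
        // resolves table names against the Hive metastore
        tableEnv.registerCatalog("myhive", hive);
        tableEnv.useCatalog("myhive");

        // Read Hive metadata directly through the Catalog API
        System.out.println(hive.listDatabases());
        System.out.println(hive.listTables("default"));
        System.out.println(
                hive.getTable(new ObjectPath("default", "my_table")).getSchema());
    }
}

The ORC data itself would stay in HDFS for now; only the metadata would go
through the catalog.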


On Mon, Aug 12, 2019 at 8:55 PM Bowen Li <bowenl...@gmail.com> wrote:

> Hi David,
>
> Check out the Hive-related documentation:
>
> - https://ci.apache.org/projects/flink/flink-docs-master/dev/table/catalog.html
> - https://ci.apache.org/projects/flink/flink-docs-master/dev/table/hive_integration.html
> - https://ci.apache.org/projects/flink/flink-docs-master/dev/table/hive_integration_example.html
>
> Note:
> - I just merged a PR restructuring the Hive-related docs today; the changes
> should be reflected on the website in a day or so
> - I didn't find a release-1.9-snapshot doc, so just reference the
> release-1.10-snapshot doc for now. 1.9 rc2 has been released, and the
> official 1.9 release should be out soon
> - Hive features are in beta in 1.9
>
> Feel free to open tickets if you have feature requests.
>
>
> On Fri, Aug 9, 2019 at 8:00 AM David Morin <morin.david....@gmail.com>
> wrote:
>
>> Hi,
>>
>> I want to connect my Flink streaming job to Hive.
>> At the moment, what is the best way to connect to Hive?
>> Some features seem to be in development.
>> Some really cool features have been described here:
>> https://fr.slideshare.net/BowenLi9/integrating-flink-with-hive-xuefu-zhang-and-bowen-li-seattle-flink-meetup-feb-2019
>> My first need is to read and update Hive metadata.
>> As for the Hive data, I can store it directly in HDFS (in ORC format) as a
>> first step.
>> Thanks.
>>
>> David
>>
>>
