Hi Shipeng,
It looks like there is an open Jira issue FLINK-18202 [1] addressing this
topic. You might want to follow up on that one. I'm adding Timo and Jark to
this thread. They might have more insights.

Best,
Matthias

[1] https://issues.apache.org/jira/browse/FLINK-18202

On Sat, May 1, 2021 at 2:00 AM Fuyao Li <fuyao...@oracle.com> wrote:

> Hello Shipeng,
>
>
>
> I am not an expert in Flink; I just want to share some of my thoughts. Maybe
> others can give you better ideas.
>
> I think there is no built-in Protobuf format for Flink SQL at the moment.
> However, you can write a user-defined format to support it [1].
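>
> To give a rough idea (an untested sketch; the identifier "my-protobuf" and the
> ProtobufDecodingFormat class are made-up placeholders), the entry point would
> be a DeserializationFormatFactory, similar to the full-stack example in [1]:
>
>   import org.apache.flink.api.common.serialization.DeserializationSchema;
>   import org.apache.flink.configuration.ConfigOption;
>   import org.apache.flink.configuration.ReadableConfig;
>   import org.apache.flink.table.connector.format.DecodingFormat;
>   import org.apache.flink.table.data.RowData;
>   import org.apache.flink.table.factories.DeserializationFormatFactory;
>   import org.apache.flink.table.factories.DynamicTableFactory;
>
>   import java.util.Collections;
>   import java.util.Set;
>
>   public class ProtobufFormatFactory implements DeserializationFormatFactory {
>
>       @Override
>       public String factoryIdentifier() {
>           // referenced as 'format' = 'my-protobuf' in the table DDL
>           return "my-protobuf";
>       }
>
>       @Override
>       public Set<ConfigOption<?>> requiredOptions() {
>           return Collections.emptySet();
>       }
>
>       @Override
>       public Set<ConfigOption<?>> optionalOptions() {
>           return Collections.emptySet();
>       }
>
>       @Override
>       public DecodingFormat<DeserializationSchema<RowData>> createDecodingFormat(
>               DynamicTableFactory.Context context, ReadableConfig formatOptions) {
>           // ProtobufDecodingFormat (not shown) would create a
>           // DeserializationSchema<RowData> that parses Protobuf bytes into RowData.
>           return new ProtobufDecodingFormat();
>       }
>   }
>
> The factory also has to be listed in
> META-INF/services/org.apache.flink.table.factories.Factory to be discovered;
> [1] walks through the remaining DecodingFormat and DeserializationSchema parts.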
>
> If you use the DataStream API, you can leverage the Kryo serializer to
> serialize and deserialize Protobuf messages [2]. There is an out-of-the-box
> integration for Protobuf there via chill-protobuf. You would then need to
> convert the resulting stream to a table for Flink SQL after data ingestion.
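>
> A minimal sketch of the registration (MyProtobufMessage is a placeholder for
> the class generated from your .proto file):
>
>   import com.twitter.chill.protobuf.ProtobufSerializer;
>   import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
>
>   public class ProtobufKryoSetup {
>       public static void main(String[] args) {
>           StreamExecutionEnvironment env =
>                   StreamExecutionEnvironment.getExecutionEnvironment();
>
>           // Serialize this generic type with Kryo's Protobuf serializer.
>           env.getConfig().registerTypeWithKryoSerializer(
>                   MyProtobufMessage.class, ProtobufSerializer.class);
>
>           // ... build the Kafka source and the rest of the pipeline here ...
>       }
>   }
>
> You also need the com.twitter:chill-protobuf dependency (with its transitive
> kryo dependency excluded), as described in [2]. After ingestion you can turn
> the DataStream into a table with StreamTableEnvironment#fromDataStream to
> continue in Flink SQL.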
>
>
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/sourceSinks.html#user-defined-sources-sinks
>
> [2]
> https://ci.apache.org/projects/flink/flink-docs-stable/dev/custom_serializers.html
>
>
>
> Best,
>
> Fuyao
>
>
>
>
>
> *From: *Shipeng Xie <xship...@vmware.com>
> *Date: *Friday, April 30, 2021 at 14:58
> *To: *user@flink.apache.org <user@flink.apache.org>
> *Subject: *[External] : Protobuf support with Flink SQL and Kafka
> Connector
>
> Hi,
>
>
>
> In
> https://ci.apache.org/projects/flink/flink-docs-release-1.12/dev/table/connectors/formats/,
> the Protobuf format is not mentioned. Does Flink SQL support the Protobuf
> format? If not, is there any plan to support it in the near future?
>
> Thanks!
>
