Hi.

Yes. Flink supports writing column values into the key part of Kafka
records. You just need to specify which columns belong to the key via the
'key.fields' option (together with a 'key.format') in the WITH clause [1],
e.g.

```
CREATE TABLE kafka_sink (
  ...
) WITH (
   ...
   'key.format' = 'json',
   'key.fields' = 'id'
);
```

[1]
https://nightlies.apache.org/flink/flink-docs-master/docs/connectors/table/kafka/#key-fields
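
For completeness, here is a minimal end-to-end sketch; the topic name,
server address, and columns are hypothetical placeholders:

```
-- Topic, server, and columns below are placeholders.
CREATE TABLE kafka_sink (
  id STRING,
  name STRING
) WITH (
  'connector' = 'kafka',
  'topic' = 'my-topic',
  'properties.bootstrap.servers' = 'localhost:9092',
  'key.format' = 'json',
  'key.fields' = 'id',
  'value.format' = 'json',
  -- 'ALL' (the default) also keeps key columns in the value part;
  -- use 'EXCEPT_KEY' to exclude them from the value
  'value.fields-include' = 'ALL'
);

-- Each inserted row gets its Kafka record key built from `id`:
INSERT INTO kafka_sink VALUES ('1', 'example');
```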

Dhavan Vaidya <dhavan.vai...@kofluence.com> wrote on Tue, May 17, 2022 at 19:16:

> Hey wang!
>
> Perhaps this is what you want:
> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/kafka/#key-format
> &
> https://nightlies.apache.org/flink/flink-docs-release-1.14/docs/connectors/table/kafka/#key-fields
> ?
>
> Note that the key fields *have* to be top-level columns of your sink
> table (i.e., fields nested inside a Row are not supported, at least in PyFlink).
>
> Thanks!
>
> On Mon, 16 May 2022 at 19:33, wang <24248...@163.com> wrote:
>
>> Hi dear engineer,
>>
>> Flink SQL supports Kafka sink tables, but I'm not sure whether it
>> supports the Kafka key in a Kafka sink table? I want to specify the
>> Kafka key when inserting data into a Kafka sink table.
>> Thanks in advance for your answer.
>>
>> Thanks & Regards,
>> Hunk
>
