Hi,
You are using a special Kafka connector. From the definition on the website:
"As a sink, the upsert-kafka connector can consume a changelog stream. It will
write INSERT/UPDATE_AFTER data as normal Kafka messages value, and write DELETE
data as Kafka messages with null values (indicate tombstone for the key)."
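Concretely, that means the sink needs a PRIMARY KEY so it knows which Kafka
message key to write (and which key the tombstones carry). A minimal sketch of
such a sink; the topic name, broker address, and formats below are placeholders
you would adjust to your environment:

CREATE TABLE bla_sink (
  total DOUBLE,
  PRIMARY KEY (total) NOT ENFORCED  -- key columns become the Kafka message key
) WITH (
  'connector' = 'upsert-kafka',
  'topic' = 'bla',                                     -- placeholder topic
  'properties.bootstrap.servers' = 'localhost:9092',   -- placeholder broker
  'key.format' = 'json',    -- serializes the PRIMARY KEY columns
  'value.format' = 'json'   -- serializes the row value
);

With this in place, INSERT/UPDATE_AFTER rows are written as regular records
keyed by total, and DELETE rows come out as records with the same key and a
null value.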
Hi,
I am stumbling over my next Flink SQL problem - but I am sure you can help
me :)
I have an extremely simple table called "bla" which just has one column of
type double. Now I want to sink that table into a Kafka topic. This is how
I do it:
CREATE TABLE bla_sink (
  total DOUBLE,
  PRIMARY KEY (total) NOT ENFORCED
) WITH ('connector' = 'upsert-kafka', ...);
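To feed the sink, the write would then look roughly like this (assuming the
single column in bla is also named total; the actual name is not shown above):

INSERT INTO bla_sink
SELECT total FROM bla;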