Hi Sumeet,
I think you might first convert the Table back to a DataStream [1],
then define the timestamp and watermark with
`assignTimestampsAndWatermarks(...)`,
and then convert it back to a Table [2]. A rough sketch of the idea is below.
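Something like the following (a minimal sketch in Java; I am assuming the table has fields `id`, `key_a` and a Long epoch-millis field `ts` in the third position, so please adjust the field names, index and the out-of-orderness bound to your actual schema):

import static org.apache.flink.table.api.Expressions.$;

import java.time.Duration;
import org.apache.flink.api.common.eventtime.SerializableTimestampAssigner;
import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;
import org.apache.flink.types.Row;

public class AddRowtime {

    // Turns an existing Table into a Table with an event-time (rowtime)
    // attribute by going through the DataStream API.
    public static Table addRowtime(StreamTableEnvironment tEnv, Table sourceTable) {
        // 1) Table -> DataStream<Row> (append-only stream).
        DataStream<Row> stream = tEnv.toAppendStream(sourceTable, Row.class);

        // 2) Assign timestamps and watermarks on the stream. Here the event time
        //    is assumed to live in field index 2 ("ts") as epoch milliseconds, and
        //    5 seconds of out-of-orderness is tolerated; both are placeholders.
        DataStream<Row> withTimestamps = stream.assignTimestampsAndWatermarks(
            WatermarkStrategy
                .<Row>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                .withTimestampAssigner(
                    (SerializableTimestampAssigner<Row>) (row, ts) -> (Long) row.getField(2)));

        // 3) DataStream -> Table again; .rowtime() replaces the "ts" field with a
        //    proper event-time attribute that windows etc. can use.
        return tEnv.fromDataStream(
            withTimestamps, $("id"), $("key_a"), $("ts").rowtime());
    }
}

If you are on a newer Flink version, `toDataStream` / `fromDataStream(DataStream, Schema)` are the newer equivalents of `toAppendStream`, but the overall pattern is the same.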
Best,
Yun
[1]
https://ci.apache.org/projects/flink/flink-docs-master/docs/dev/table/common/
Hi,
My use case involves reading raw data records from Kafka and processing
them. The records come from a database, where a periodic job reads new
rows, packages them into a single JSON object (as shown below), and
writes the entire record to Kafka.
{
'id': 'some_id',
'key_a': 'v