Hi, I am using Flink 1.16 with a streaming job written against the PyFlink SQL API. Whenever a new streaming event arrives, it is not processed by the last Flink operator (which performs a temporal join and writes the result to a Kafka topic); the record is only pushed to Kafka when the *next* streaming event arrives. It is as if the last operator needs a new event in order to emit the previous one. Has anyone experienced a similar issue? I would really appreciate any advice on how to resolve this. Please let me know if you need more details.
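In case it helps, here is a simplified sketch of what the job looks like. This is not the actual job; the table names, topic names, columns, and watermark intervals below are placeholders, but the structure (two event-time Kafka sources with watermarks, an event-time temporal join, and an INSERT into a Kafka sink) matches what we run:

```sql
-- Placeholder source table for the main event stream.
CREATE TABLE orders (
    order_id   STRING,
    currency   STRING,
    price      DECIMAL(10, 4),
    order_time TIMESTAMP(3),
    WATERMARK FOR order_time AS order_time - INTERVAL '5' SECOND
) WITH (
    'connector' = 'kafka',
    'topic' = 'orders',
    'properties.bootstrap.servers' = 'localhost:9092',
    'format' = 'json'
);

-- Placeholder versioned table used as the temporal-join side.
CREATE TABLE currency_rates (
    currency    STRING,
    rate        DECIMAL(10, 4),
    update_time TIMESTAMP(3),
    WATERMARK FOR update_time AS update_time - INTERVAL '5' SECOND,
    PRIMARY KEY (currency) NOT ENFORCED
) WITH (
    'connector' = 'upsert-kafka',
    'topic' = 'rates',
    'properties.bootstrap.servers' = 'localhost:9092',
    'key.format' = 'json',
    'value.format' = 'json'
);

-- Event-time temporal join; the result is written to a Kafka sink topic.
-- It is this final join + sink step where each record only appears in
-- Kafka after the next event arrives on the orders stream.
INSERT INTO enriched_orders
SELECT o.order_id, o.price * r.rate AS converted_amount, o.order_time
FROM orders AS o
LEFT JOIN currency_rates FOR SYSTEM_TIME AS OF o.order_time AS r
    ON o.currency = r.currency;
```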
Thanks, Yad