Hi Xuyang,
Thanks for the reply!
I haven't used a print connector yet.
Thanks,
Yad
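For context, the print connector referenced above is Flink's built-in debugging sink: rows inserted into a print table are written to the TaskManager stdout/logs, which makes it handy for checking whether events actually reach a given operator. A minimal sketch (table and column names are hypothetical):

```sql
-- Hypothetical debug sink using Flink's built-in print connector.
-- Rows inserted into this table are written to the TaskManager logs.
CREATE TABLE debug_sink (
  event_id STRING,
  event_time TIMESTAMP(3)
) WITH (
  'connector' = 'print'
);

-- Route the join result here instead of Kafka to see what is emitted, and when.
INSERT INTO debug_sink SELECT event_id, event_time FROM joined_events;
```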
From: Xuyang
Sent: Monday, December 18, 2023 8:26 AM
To: T, Yadhunath
Cc: user@flink.apache.org
Subject: Re:Re: Re:Re: Event stuck in the Flink operator
To: T, Yadhunath
Cc: user@flink.apache.org; Shames, Joy; Maj, Marek; Kornegay, Robert; Michael, Dennis; Kerelli, Sharath
Subject: Re: Event stuck in the Flink operator
Hi, Yad.
Can you share the smallest SQL that can reproduce this problem?
BTW, by "the last Flink operator" do you mean the sink using the Kafka connector?
--
Best!
Xuyang
On 2023-12-15 04:38:21, "Alex Cruise" wrote:
Can you share your precise join semantics?
I don't know about Flink SQL offhand, but here are a couple ways to do this
when you're using the DataStream API:
* use the Session Window join
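Since the job in question is written against Flink SQL rather than the DataStream API, it may help to note that Flink SQL has no direct session-window join; the closest built-in analogue is an interval join, which pairs rows whose event times fall within a bounded gap of each other. A sketch with hypothetical table and column names:

```sql
-- Hypothetical interval join: match rows from two streams on a key when
-- their event times are within 5 minutes of each other.
SELECT l.id, l.payload AS left_payload, r.payload AS right_payload
FROM left_stream AS l
JOIN right_stream AS r
  ON l.id = r.id
 AND l.event_time BETWEEN r.event_time - INTERVAL '5' MINUTE
                      AND r.event_time + INTERVAL '5' MINUTE;
```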
Hi,
I am using Flink version 1.16 and I have a streaming job that uses the PyFlink SQL API.
Whenever a new streaming event comes in, it is not processed by the last
Flink operator (which performs a temporal join and writes the data into a Kafka
topic), and it will only be pushed to Kafka on
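For reference, an event-time temporal join in Flink SQL typically looks like the sketch below (table and column names are hypothetical). One property worth noting for this symptom: the join only emits results as watermarks advance on both inputs, so a row can appear "stuck" until a later event (or idle-source handling) moves the watermark forward.

```sql
-- Hypothetical event-time temporal join. Both tables need watermark
-- definitions, and the versioned (right) side needs a primary key.
SELECT o.order_id, o.amount, r.rate
FROM orders AS o
JOIN currency_rates FOR SYSTEM_TIME AS OF o.order_time AS r
  ON o.currency = r.currency;
```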