n demand as needed.

On Tuesday, April 20, 2021, 01:11:08 PM PDT, Ejaskhan S <iamejask...@gmail.com> wrote:
Hi Ahmed,
If you want to dynamically produce events to different topics and you have
the logic to identify the target topics, you can achieve this in the
following way:
- Suppose this is your event after the transformation logic (if any): EVENT.
- This is the target topic
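The routing idea described above (derive the target topic from the event itself, then hand topic and payload to the producer) can be sketched without any Flink dependency. In this sketch, `select_topic`, the topic-naming scheme, and `MockProducer` are all hypothetical stand-ins for the user's own routing logic and a real Kafka producer:

```python
# Framework-free sketch of per-event topic routing: compute the target
# topic from the event, then hand topic + payload to the producer.
# select_topic and MockProducer are hypothetical stand-ins.

def select_topic(event: dict) -> str:
    """Routing logic: derive the target topic from the event itself."""
    return f"orders-{event['region']}"  # hypothetical naming scheme

class MockProducer:
    """Stand-in for a Kafka producer; records what was sent per topic."""
    def __init__(self):
        self.sent = {}

    def send(self, topic: str, value: dict) -> None:
        self.sent.setdefault(topic, []).append(value)

producer = MockProducer()
for event in [{"region": "eu", "id": 1}, {"region": "us", "id": 2}]:
    producer.send(select_topic(event), event)

print(sorted(producer.sent))  # → ['orders-eu', 'orders-us']
```

In Flink itself, the same idea maps onto a Kafka sink whose serialization schema chooses the topic per record (recent connector versions expose a topic-selector for this); check the Kafka connector documentation for your Flink version for the exact API.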
We are looking for a similar option. Currently we are proceeding with
option 1.
Thanks Jessy for the question.
Thanks Arvid for the reply.
Can you please elaborate a little bit on option 2, if possible?
Thanks
Jessy
On Mon, Mar 22, 2021, 7:27 PM Arvid Heise wrote:
> Hi Jessy,
>
>> Can I add a new sink into the execution graph at runtime, for example: a
>> new Kafka producer, without restarting the
Yes Gordon, it has definitely given me a starting point to think about.
On Wed, Feb 3, 2021, 12:02 PM Tzu-Li (Gordon) Tai wrote:
> Hi,
>
> There is no out-of-box Flink source/sink connector for this, but it isn't
> unheard of that users have implemented something to support what you
> outlined.
>
>
> events to
> Kafka (topic) / RabbitMQ (queue) without persisting them in the data store. Let
> Flink do all the processing and finally write to the data store.
>
> Thank you
> Raghavendar T S
> https://www.linkedin.com/in/raghavendar-ts
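The pipeline shape described above (raw events go to Kafka/RabbitMQ first, Flink does the processing, and only the derived result reaches the data store) can be sketched without any messaging infrastructure. Here a `Queue` stands in for the Kafka topic or RabbitMQ queue, and `process` is a hypothetical enrichment step; every name is illustrative:

```python
# Framework-free sketch: raw events land in a queue (standing in for
# Kafka/RabbitMQ), get processed Flink-style, and only the derived
# result is persisted to the store.
from queue import Queue

events = Queue()   # stands in for the Kafka topic / RabbitMQ queue
data_store = []    # stands in for the final data store

def process(event: dict) -> dict:
    """Hypothetical enrichment step (the 'Flink processing')."""
    return {**event, "processed": True}

# Producer side: raw events go to the queue, never straight to the store.
for raw in [{"id": 1}, {"id": 2}]:
    events.put(raw)

# Consumer side: process each event, then persist only the result.
while not events.empty():
    data_store.append(process(events.get()))

print(data_store)  # → [{'id': 1, 'processed': True}, {'id': 2, 'processed': True}]
```

The point of the design is that the queue absorbs ingest spikes and decouples producers from the store; the store only ever sees processed output.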
On Wed, Feb 3, 2021 at 11:29 AM Ejaskhan wrote:
Team,
It's just a random thought.
Can I make the Flink application expose a REST endpoint as the data
source, so a client could send data to this endpoint? Flink would then
process this data and respond to the client application through the same
endpoint, like a client-server model.
Thanks
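The ingestion half of this idea can be sketched with a plain HTTP server that accepts POSTed events and hands them to the processing side via a queue. Flink has no built-in REST source connector, so in practice this would be a custom source or a small ingestion service in front of Kafka; every name here is illustrative:

```python
# Sketch of "REST endpoint as a data source": an HTTP server accepts
# POSTed JSON events and enqueues them for downstream processing.
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer
from queue import Queue

inbox = Queue()  # hand-off between the HTTP frontend and the pipeline

class IngestHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        inbox.put(json.loads(body))  # enqueue the event for processing
        self.send_response(202)      # accepted; processed asynchronously
        self.end_headers()

    def log_message(self, *args):    # keep the example's output quiet
        pass

server = HTTPServer(("127.0.0.1", 0), IngestHandler)  # port 0: any free port
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client side: POST one event, then the pipeline reads it off the queue.
port = server.server_address[1]
req = urllib.request.Request(
    f"http://127.0.0.1:{port}/events",
    data=json.dumps({"id": 1}).encode(),
    method="POST",
)
with urllib.request.urlopen(req) as resp:
    status = resp.status

received = inbox.get(timeout=5)
server.shutdown()
print(status, received)  # → 202 {'id': 1}
```

Note the server answers 202 (accepted) immediately rather than waiting for a result: a synchronous request/response loop (the "client-server model" part of the question) is hard to map onto a streaming job, and results would usually come back to the client over a separate channel rather than the same HTTP call.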