Hello, I am trying out Flink for a stream processing scenario and was wondering whether it can be achieved using Apache Flink, so any pointers on how it could be done would be a great help.
Scenario: a Kafka topic holds the input for stream processing. Multiple applications, say A and B, publish their messages to the same topic (Topic X) with different keys (the keys being the application names). These messages are read by the stream processing application and processed, eventually landing in sinks specific to A and B. The end goal is to make this entire piece dynamic, so that new applications C, D, E, etc. are accommodated automatically.

At the moment I have figured out the Kafka source and the stream processing part. What I am not clear about is whether conditional multiple sinks would work in a streaming job, i.e. data from application A lands in Sink A, application B -> Sink B, and so on. From the implementation side I could probably split the stream and pass the resulting streams to their respective tables, but all of this needs to happen dynamically. Would Apache Flink be able to support this, and if yes, how?

I am using Apache Flink 1.17.1 with the pipeline written in Java.

Thank you in advance.

Regards,
-Yogesh
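P.S. To make the question concrete, here is a plain-Java sketch (no Flink APIs involved) of the dynamic demultiplexing I have in mind: records keyed by application name are routed to per-application sinks, and a sink for a previously unseen application is created lazily on first sight. All class and method names here are hypothetical stand-ins, not anything from the Flink API.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of key-based routing with lazily created per-application sinks.
public class DynamicDemuxSketch {

    // Stand-in for a per-application sink (Sink A, Sink B, ...).
    static final class Sink {
        final String app;
        final List<String> received = new ArrayList<>();
        Sink(String app) { this.app = app; }
        void write(String value) { received.add(value); }
    }

    // Registry of sinks, created on demand per application key.
    // LinkedHashMap only so iteration order matches first-seen order.
    final Map<String, Sink> sinks = new LinkedHashMap<>();

    // Route one record: the key picks the sink; an unknown key gets a new sink.
    void route(String appKey, String value) {
        sinks.computeIfAbsent(appKey, Sink::new).write(value);
    }

    public static void main(String[] args) {
        DynamicDemuxSketch demux = new DynamicDemuxSketch();
        demux.route("A", "a-1");
        demux.route("B", "b-1");
        demux.route("C", "c-1"); // new application handled automatically
        demux.route("A", "a-2");
        System.out.println(demux.sinks.keySet());          // [A, B, C]
        System.out.println(demux.sinks.get("A").received); // [a-1, a-2]
    }
}
```

In Flink terms I imagine this maps to something like keying the stream by application name and then having the sink pick its destination per record, but that per-record sink selection is exactly the part I am unsure about.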