Hi Team,

    I'm working on a Flink streaming application. The data is ingested
through Kafka connectors. The event payload is a string, and the volume is
roughly 100K events/sec. Let's call this stream DataStream1.
The application also consumes a second DataStream, call it DataStream2,
off another Kafka topic. The elements of DataStream2 go through a
transformation whose end result is an update to a HashMap (a java.util
collection). The application therefore needs to share this HashMap across
the Flink cluster so that the operators processing DataStream1 can check
the current state of the values in this collection. Is there a way to do
this in Flink?

    I don't see any shared collection construct available within the Flink
cluster. Am I missing something?
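
    To make the intent concrete, below is a minimal, self-contained sketch
of what I'm trying to achieve (class and variable names are purely
illustrative, and the Kafka sources are replaced with fromElements just to
keep the sketch runnable). The static HashMap here is of course only local
to each JVM, which is exactly the limitation I'm asking about:

import java.util.HashMap;
import java.util.Map;

import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class SharedMapQuestion {

    // Plain static map: visible only inside a single JVM / task manager,
    // not across the whole Flink cluster.
    private static final Map<String, String> lookup = new HashMap<>();

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();

        // In the real job both streams come from Kafka topics;
        // fromElements is only a stand-in to keep this sketch self-contained.
        DataStream<String> dataStream2 = env.fromElements("key1=value1", "key2=value2");
        DataStream<String> dataStream1 = env.fromElements("event-for-key1", "event-for-key2");

        // DataStream2: transformation whose end result is an update of the HashMap.
        dataStream2.map(new MapFunction<String, String>() {
            @Override
            public String map(String value) {
                String[] kv = value.split("=");
                lookup.put(kv[0], kv[1]);   // updates the map in this JVM only
                return value;
            }
        }).print();

        // DataStream1: each event should be able to check the current values in that map.
        dataStream1.map(new MapFunction<String, String>() {
            @Override
            public String map(String value) {
                // How can this lookup see the HashMap maintained from DataStream2
                // when the operators run distributed across the cluster?
                return value + " -> " + lookup.getOrDefault("key1", "unknown");
            }
        }).print();

        env.execute("shared-map-question");
    }
}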

Best Regards
CVP
