Have a look at the window and updateStateByKey operations. If you are looking
for something more sophisticated, you can persist these streams to an
intermediate store (say, for x duration) such as HBase, Cassandra, or any
other DB, and run your global aggregations against that.
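
As a rough sketch (using the receiver-based Kafka API from Spark 1.x; the
topic names "clicks" and "views", the ZooKeeper quorum, the intervals, the
checkpoint path, and the comma-separated key extraction are all placeholders
for illustration), you could window the per-topic counts, join them on key,
and keep a running global total with updateStateByKey:

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object MultiStreamAggregation {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("MultiStreamAggregation")
    val ssc  = new StreamingContext(conf, Seconds(10))
    ssc.checkpoint("/tmp/spark-checkpoint")  // required for updateStateByKey

    // One InputDStream per Kafka topic; map each record to (key, count).
    val clicks = KafkaUtils
      .createStream(ssc, "zkhost:2181", "consumer-group", Map("clicks" -> 1))
      .map { case (_, v) => (v.split(",")(0), 1L) }
    val views = KafkaUtils
      .createStream(ssc, "zkhost:2181", "consumer-group", Map("views" -> 1))
      .map { case (_, v) => (v.split(",")(0), 1L) }

    // Windowed counts per key: last 5 minutes, sliding every 10 seconds.
    val clickCounts = clicks.reduceByKeyAndWindow(_ + _, Seconds(300), Seconds(10))
    val viewCounts  = views.reduceByKeyAndWindow(_ + _, Seconds(300), Seconds(10))

    // Join the two windowed streams on key. Keys that arrived on only one
    // side within the window simply drop out of the inner join.
    val joined = clickCounts.join(viewCounts)

    // Global running totals across all batches and both streams.
    val totals = clicks.union(views).updateStateByKey[Long] {
      (newValues: Seq[Long], state: Option[Long]) =>
        Some(state.getOrElse(0L) + newValues.sum)
    }

    // For longer-horizon global aggregations, each batch could instead be
    // written out with foreachRDD to HBase/Cassandra and aggregated there.
    // totals.foreachRDD { rdd => /* write rdd to the external store */ }

    joined.print()
    totals.print()

    ssc.start()
    ssc.awaitTermination()
  }
}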

Thanks
Best Regards

On Wed, Jul 1, 2015 at 1:06 PM, Spark Enthusiast <sparkenthusi...@yahoo.in>
wrote:

> Hi,
>
> I have to build a system that reacts to a set of events. Each of these
> events is a separate stream by itself, consumed from a different Kafka
> topic, and hence will have its own InputDStream.
>
> Questions:
>
> Will I be able to do joins across multiple InputDStreams and collate the
> output using a single Accumulator?
> These event streams can each have their own frequency of occurrence. How
> will I be able to coordinate the out-of-sync behaviour?
>
