Hi there,

I have read about the two fundamental shared-state features in Spark
(broadcast variables and accumulators), but neither seems to fit my use case.

I'm using Spark Streaming to consume requests from Kafka. These requests
may launch long-running tasks, and I need to control them:

1) Keep them in a shared container, like a HashMap, so I can retrieve them
by ID, for example.
2) Retrieve an instance of such an object/task on demand (on request,
in fact).
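To make the desired API concrete, here is a rough single-JVM sketch of the "shared bag" I have in mind (the class and method names are just illustrative). Since Spark executors don't share memory, in a real cluster this map would have to be backed by an external store (e.g. a Hazelcast IMap) rather than a local ConcurrentHashMap:

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;

// Illustrative sketch only: a local stand-in for the shared task registry.
// On a cluster, the map itself would need to live outside the executors,
// e.g. in a distributed data grid such as Hazelcast.
class TaskRegistry {
    private final ConcurrentMap<String, Runnable> tasks = new ConcurrentHashMap<>();

    // Register a long-running task under an ID when a Kafka request arrives.
    void register(String id, Runnable task) {
        tasks.put(id, task);
    }

    // Retrieve the task on demand (on request), by ID.
    Runnable lookup(String id) {
        return tasks.get(id);
    }

    // Remove a task once it is finished or cancelled.
    boolean remove(String id) {
        return tasks.remove(id) != null;
    }
}
```

The point is the shape of the operations (register / lookup by ID / remove), not this particular implementation.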


Any ideas about this? How can I share objects between slaves? Should I use
something outside Spark (maybe Hazelcast)?


Regards.
