If you want to be notified after every batch completes, you can implement
the StreamingListener
<https://spark.apache.org/docs/1.3.1/api/scala/index.html#org.apache.spark.streaming.scheduler.StreamingListener>
interface, which has callbacks such as onBatchCompleted and onBatchStarted,
and push the configuration changes to ZooKeeper from there. That code runs
on the driver, of course. In the task code, you can then pull the
configuration changes from ZooKeeper.
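For illustration, here is a minimal, untested sketch in Scala. It assumes
Apache Curator as the ZooKeeper client; the connect string, the znode path
/myapp/config, and loadCurrentConfig() are placeholders you would replace
with your own:

import org.apache.curator.framework.{CuratorFramework, CuratorFrameworkFactory}
import org.apache.curator.retry.ExponentialBackoffRetry
import org.apache.spark.streaming.scheduler.{StreamingListener, StreamingListenerBatchCompleted}

// Driver side: push the updated configuration to ZooKeeper after each batch.
class ConfigPushListener(zkConnect: String) extends StreamingListener {
  private val client: CuratorFramework =
    CuratorFrameworkFactory.newClient(zkConnect, new ExponentialBackoffRetry(1000, 3))
  client.start()

  override def onBatchCompleted(batch: StreamingListenerBatchCompleted): Unit = {
    // loadCurrentConfig() stands in for however you obtain the new settings.
    // This assumes the znode /myapp/config already exists.
    client.setData().forPath("/myapp/config", loadCurrentConfig())
  }

  private def loadCurrentConfig(): Array[Byte] =
    s"updated-at-${System.currentTimeMillis()}".getBytes("UTF-8")
}

// Task side: a per-JVM singleton, so each executor opens one connection and
// every task on that executor reads the latest configuration through it.
object TaskConfig {
  private lazy val client: CuratorFramework = {
    val c = CuratorFrameworkFactory.newClient("zk-host:2181",
      new ExponentialBackoffRetry(1000, 3))
    c.start()
    c
  }
  def latest(): String =
    new String(client.getData.forPath("/myapp/config"), "UTF-8")
}

Register the listener on the driver with
ssc.addStreamingListener(new ConfigPushListener("zk-host:2181")), and call
TaskConfig.latest() inside your transformations to pick up the current
value.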

Thanks
Best Regards

On Mon, May 25, 2015 at 7:45 AM, bit1...@163.com <bit1...@163.com> wrote:

> Can someone please help me on this?
>
> ------------------------------
> bit1...@163.com
>
>
> *From:* bit1...@163.com
> *Sent:* 2015-05-24 13:53
> *To:* user <user@spark.apache.org>
> *Subject:* How to use zookeeper in Spark Streaming
>
> Hi,
> In my spark streaming application, when the application starts and get
> running, the Tasks running on the Worker nodes need to be notified that
> some configurations have been changed from time to time, these
> configurations reside on the Zookeeper.
>
> My question is, where should I put the code that works with Zookeeper for
> the configuration change, in the Driver code or in the Task code? Is there
> some guide on this? Thanks.
>
> ------------------------------
> bit1...@163.com
>
>
