Once you start your streaming application reading from Kafka, it will
launch receivers on the executor nodes. You can see them on the
Streaming tab of your driver UI (runs on port 4040).


These receivers stay fixed until the end of your pipeline (unless one
crashes and is restarted, etc.). You can say each receiver will occupy a
single core.
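
For reference, a minimal receiver-based setup looks like the sketch
below (the ZooKeeper host, consumer group id, and topic name are
placeholders -- adjust them for your cluster):

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Receiver-based Kafka stream: the receiver is launched once on an
// executor when the context starts, and stays there for the lifetime
// of the StreamingContext -- it is not re-registered every batch.
val conf = new SparkConf().setAppName("KafkaReceiverDemo")
val ssc = new StreamingContext(conf, Seconds(10))

val stream = KafkaUtils.createStream(
  ssc,
  "zk-host:2181",          // ZooKeeper quorum (placeholder)
  "my-consumer-group",     // Kafka consumer group id (placeholder)
  Map("my-topic" -> 1))    // topic -> number of receiver threads

stream.map(_._2).print()   // tuples are (key, message); print values

ssc.start()                // receivers register with Kafka here, once
ssc.awaitTermination()
```

So the consumer registration happens once at `ssc.start()`, not once per
batch interval.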

Thanks
Best Regards

On Wed, Apr 15, 2015 at 3:46 PM, Shushant Arora <shushantaror...@gmail.com>
wrote:

> Hi
>
> I want to understand the flow of spark streaming with kafka.
>
> In Spark Streaming, are the executor nodes the same at each run of the
> streaming interval, or does the cluster manager assign new executor
> nodes for each batch of input? If the latter, do new executors register
> themselves as Kafka consumers at each batch interval?
>
> Even without Kafka, are the executor nodes the same at each batch
> interval, or does the driver get new executor nodes from the cluster
> manager?
>
> Thanks
> Shushant
>
