If you all can say a little more about what your requirements are, maybe we
can get a JIRA together.

I think the easiest way to deal with this currently is to start a new job
before stopping the old one, which should prevent latency problems.
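A rough sketch of that overlap approach is below. This is a non-authoritative illustration, assuming the Spark 1.x direct Kafka API (`KafkaUtils.createDirectStream` from the spark-streaming-kafka artifact) and that each topic set runs as its own driver application, since only one StreamingContext can be active per JVM; the app name, broker address, and batch interval are placeholders:

```scala
import kafka.serializer.StringDecoder
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

// Driver that takes its topic list from the command line, e.g.
//   spark-submit ... app.jar topicA,topicB
// To add a topic: submit a second instance with the expanded topic list,
// then stop this job once the new job's batches are running, so there is
// no window in which neither job is consuming.
object TopicSwapExample {
  def main(args: Array[String]): Unit = {
    val topics = args(0).split(",").toSet
    val conf = new SparkConf().setAppName("topic-swap-example") // placeholder name
    val ssc = new StreamingContext(conf, Seconds(5))            // placeholder interval
    val kafkaParams = Map("metadata.broker.list" -> "broker1:9092") // placeholder broker
    val stream = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](
      ssc, kafkaParams, topics)
    stream.foreachRDD(rdd => println(s"batch record count: ${rdd.count()}")) // your processing here
    ssc.start()
    ssc.awaitTermination()
  }
}
```

With checkpointing or offset tracking disabled, the two jobs will briefly consume the same topics in parallel during the handover, so downstream processing should tolerate duplicates across the swap.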

On Thu, Aug 27, 2015 at 9:24 AM, Sudarshan Kadambi (BLOOMBERG/ 731 LEX) <
skada...@bloomberg.net> wrote:

> This is something we have been needing for a while too. We are restarting
> the streaming context to handle new topic subscriptions & unsubscriptions,
> which affects the latency of update handling. I think this is something that
> needs to be addressed in core Spark Streaming (I can't think of any
> fundamental limitation that prevents it; perhaps nobody has expressed
> interest in this feature so far?).
>
> From: yael.aharo...@gmail.com At: Aug 27 2015 10:19:33
> To: user@spark.apache.org
> Subject: Re:Adding Kafka topics to a running streaming context
>
> Hello,
> My streaming application needs to allow consuming new Kafka topics at
> arbitrary times. I know I can stop and restart the streaming context when I
> need to introduce a new stream, but that seems quite disruptive. I am
> wondering whether other people have run into this situation and whether
> there is a more elegant solution?
> Thanks, Yael
>