[ 
https://issues.apache.org/jira/browse/SPARK-22561?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Hyukjin Kwon updated SPARK-22561:
---------------------------------
    Labels: bulk-closed  (was: )

> Dynamically update topics list for spark kafka consumer
> -------------------------------------------------------
>
>                 Key: SPARK-22561
>                 URL: https://issues.apache.org/jira/browse/SPARK-22561
>             Project: Spark
>          Issue Type: New Feature
>          Components: DStreams
>    Affects Versions: 2.1.0, 2.1.1, 2.2.0
>            Reporter: Arun
>            Priority: Major
>              Labels: bulk-closed
>
> The Spark Streaming application should allow adding new topics after the 
> streaming context is initialized and the DStream is started. This feature is 
> very useful, especially when the business operates across multiple geographies 
> or multiple business units. For example, initially I have a spark-kafka 
> consumer listening on topics ["topic-1","topic-2"], and after a couple of days 
> I add new topics ["topic-3","topic-4"] to Kafka. Is there a way to update the 
> spark-kafka consumer's topic list and have it consume data for the updated 
> list of topics without stopping the Spark Streaming application or the 
> streaming context?
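A commonly suggested workaround (not a true dynamic update of an explicit topic list, but it covers the new-topic case) is to subscribe with a regular expression via `ConsumerStrategies.SubscribePattern` from the spark-streaming-kafka-0-10 integration: topics created later that match the pattern are picked up without restarting the streaming context. A minimal sketch, assuming a broker at localhost:9092, a hypothetical group id, and topics named topic-N:

```scala
import java.util.regex.Pattern

import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent
import org.apache.spark.streaming.kafka010.ConsumerStrategies.SubscribePattern

object PatternConsumer {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("pattern-consumer")
    val ssc = new StreamingContext(conf, Seconds(10))

    // Assumed broker address and group id; adjust for your cluster.
    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092",
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "pattern-consumer-group",
      "auto.offset.reset" -> "latest"
    )

    // Matches topic-1 and topic-2 now, and any later-created topic-N as well;
    // the underlying Kafka consumer refreshes its topic metadata periodically,
    // so new matching topics are consumed without stopping the context.
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc,
      PreferConsistent,
      SubscribePattern[String, String](Pattern.compile("topic-.*"), kafkaParams)
    )

    stream.map(record => record.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```

How quickly newly created topics are noticed is governed by the consumer's metadata refresh interval (the `metadata.max.age.ms` Kafka consumer setting), so there can be a delay of up to that interval before data from a new topic appears.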



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
