Hi all, I want to write a Spark Streaming program that listens to Kafka for a list of topics. The list of topics that I want to consume is stored in a DB and might change dynamically. I plan to periodically refresh this list of topics in the Spark Streaming app.
My question is: is it possible to add or remove a Kafka topic that is consumed by a stream, or to create a new stream, at runtime? Would I need to stop and restart the program, or is there another way to do this? Thanks! Nisrina
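For reference, here is roughly what my current setup looks like, sketched with the spark-streaming-kafka-0-10 direct stream API. The broker address, group id, topic names, and the `fetchTopicsFromDb` helper are placeholders, not my real code:

```scala
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010.KafkaUtils
import org.apache.spark.streaming.kafka010.ConsumerStrategies.Subscribe
import org.apache.spark.streaming.kafka010.LocationStrategies.PreferConsistent

object DynamicTopicsApp {
  // Hypothetical helper: in my app this queries the DB for the
  // current topic list, which can change over time.
  def fetchTopicsFromDb(): Set[String] = Set("topicA", "topicB")

  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("dynamic-topics")
    val ssc = new StreamingContext(conf, Seconds(10))

    val kafkaParams = Map[String, Object](
      "bootstrap.servers" -> "localhost:9092", // placeholder broker
      "key.deserializer" -> classOf[StringDeserializer],
      "value.deserializer" -> classOf[StringDeserializer],
      "group.id" -> "dynamic-topics-group"
    )

    // Topic set is fixed at stream creation time; my question is
    // whether this set can be changed without stopping the context.
    val topics = fetchTopicsFromDb()
    val stream = KafkaUtils.createDirectStream[String, String](
      ssc, PreferConsistent, Subscribe[String, String](topics, kafkaParams))

    stream.map(record => record.value).print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```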