Re: Spark Streaming | Dynamic Action Support

2022-03-03 Thread Mich Talebzadeh
In short, I don't think there is such a possibility. However, there is the option of shutting down Spark gracefully with a checkpoint directory enabled. That way, you can re-submit the modified code, which will pick up the BatchID from where it left off, assuming the topic is the same. See the
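
Below is a minimal PySpark sketch of the pattern described above, assuming a Kafka source; the application name, broker address, topic, checkpoint path and output path are placeholders, not anything from the original thread:

```python
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("checkpointed_kafka_stream")   # hypothetical app name
         .getOrCreate())

# Read from the same Kafka topic across restarts; broker/topic are placeholders.
df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker1:9092")
      .option("subscribe", "sales_topic")
      .load())

def process_batch(batch_df, batch_id):
    # The actions executed per micro-batch live here. After a graceful
    # shutdown and re-submission of modified code, Spark resumes from the
    # last committed batch_id recorded in the checkpoint directory.
    (batch_df.selectExpr("CAST(value AS STRING) AS value")
             .write.mode("append").parquet("/tmp/stream_out"))  # placeholder sink

query = (df.writeStream
         .foreachBatch(process_batch)
         .option("checkpointLocation", "/tmp/checkpoints/sales_topic")  # keep stable across restarts
         .start())

# In practice an external signal (flag file, REST call, etc.) would trigger
# query.stop() so the current micro-batch can finish; the modified application
# is then re-submitted with the SAME checkpoint location and topic and
# continues from where the previous run left off.
query.awaitTermination()
```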

Spark Streaming | Dynamic Action Support

2022-03-03 Thread Pappu Yadav
Hi, Is there any way I can add/delete actions/jobs dynamically in a running Spark Streaming job? I will call an API and execute only the actions configured in the system. E.g., in the first batch suppose there are 5 actions in the Spark application. Now suppose some configuration is changed and
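
For illustration only, here is a PySpark sketch of the kind of job the question seems to describe, where the actions run in each micro-batch depend on external configuration; the config lookup, action names, source and paths are all hypothetical:

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dynamic_actions_demo").getOrCreate()

# Placeholder source purely for demonstration.
stream_df = (spark.readStream
             .format("rate")
             .option("rowsPerSecond", 10)
             .load())

def fetch_configured_actions():
    # Hypothetical: call the configuration API and return the names of the
    # actions that should run for this micro-batch.
    return ["write_parquet", "update_counts"]

def run_actions(batch_df, batch_id):
    actions = fetch_configured_actions()
    if "write_parquet" in actions:
        batch_df.write.mode("append").parquet("/tmp/actions_out")   # placeholder path
    if "update_counts" in actions:
        print(f"batch {batch_id}: {batch_df.count()} rows")

query = (stream_df.writeStream
         .foreachBatch(run_actions)
         .option("checkpointLocation", "/tmp/checkpoints/dynamic_actions")
         .start())

query.awaitTermination()
```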