Spark Streaming | Dynamic Action Support

2022-03-03 Thread Pappu Yadav
Hi,

Is there any way to add or delete actions/jobs dynamically in a running
Spark Streaming job?
I will call an API and execute only the actions that are currently
configured in the system.

E.g., in the first batch suppose there are 5 actions in the Spark
application.
Now suppose the configuration changes and one action is added and one
is deleted.
How can I handle this in the Spark Streaming job without restarting the
application?
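
For illustration, a rough sketch of the idea (assuming Structured Streaming
with foreachBatch; fetch_configured_actions() and the action names below are
hypothetical stand-ins for the configuration API):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("dynamic-actions").getOrCreate()

def fetch_configured_actions():
    # Placeholder for the config API call; returns the currently enabled actions.
    return {"write_to_parquet", "update_counts"}

def process_batch(batch_df, batch_id):
    # Re-read the configuration at the start of every micro-batch, so actions
    # added or removed in the config take effect on the next batch.
    enabled = fetch_configured_actions()
    if "write_to_parquet" in enabled:
        batch_df.write.mode("append").parquet("/data/out")   # placeholder path
    if "update_counts" in enabled:
        print(f"batch {batch_id}: {batch_df.count()} rows")

stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")  # placeholder
          .option("subscribe", "events")                      # placeholder
          .load())

query = (stream.writeStream
         .foreachBatch(process_batch)
         .option("checkpointLocation", "/tmp/checkpoints/dynamic-actions")
         .start())

query.awaitTermination()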


Re: Spark Streaming | Dynamic Action Support

2022-03-03 Thread Mich Talebzadeh
In short, I don't think there is such a possibility. However, there is the
option of shutting down Spark gracefully with a checkpoint directory
enabled. That way you can re-submit the modified code, which will pick up
the batch ID from where it left off, assuming the topic is the same. See
the thread
"How to gracefully shutdown Spark Structured Streaming" in
https://lists.apache.org/list.html?user@spark.apache.org
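
Something along these lines (a rough PySpark sketch; the marker-file path,
topic and output paths are just placeholders): the driver polls for a
shutdown marker, waits for the in-flight micro-batch to finish, and stops
the query. A re-submitted job that reuses the same checkpointLocation then
resumes from the recorded offsets.

import os
import time

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("graceful-shutdown").getOrCreate()

df = (spark.readStream
      .format("kafka")
      .option("kafka.bootstrap.servers", "broker:9092")   # placeholder
      .option("subscribe", "events")                       # placeholder
      .load())

query = (df.writeStream
         .format("parquet")
         .option("path", "/data/out")                                # placeholder sink
         .option("checkpointLocation", "/tmp/checkpoints/my-query")  # reused on restart
         .start())

SHUTDOWN_MARKER = "/tmp/stop_streaming_job"   # hypothetical control file

while query.isActive:
    if os.path.exists(SHUTDOWN_MARKER):
        # Wait until no micro-batch is in flight before stopping, so the
        # checkpoint reflects a completed batch.
        while query.status.get("isTriggerActive", False):
            time.sleep(5)
        query.stop()
    time.sleep(30)

spark.stop()

Create the marker file when you want to stop, submit the modified code with
the same checkpoint location, and the offsets recorded in the checkpoint
directory take care of the rest.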

HTH

On Thu, 3 Mar 2022 at 15:49, Mich Talebzadeh wrote:

> What is the definition of action here?