Hi, I am new to Spark Streaming. I have developed a Spark Streaming job that runs with a 30-minute batch interval and uses a checkpoint directory.
I need to deploy a minor change. Should I kill the job with `yarn application -kill` once the current batch completes, and then update the jar file? My question is: if I follow this approach, will Spark Streaming pick up from the offsets saved in the checkpoint after the restart? Is there a better approach? Thanks in advance for your suggestions. Thanks, Asmath