Hi,

I'm a PhD student trying to model the performance (processing delay,
throughput) of Spark Streaming jobs, and I wonder whether there is a way to
live-migrate a Spark Streaming job from one configuration to another
(i.e. without having to interrupt the job and then re-submit it with the
spark-submit command and the new configuration parameters).

Is this possible with both APIs (DStreams and Structured Streaming)?

Thank you!

Best,

Khaled.
