Hi,

I have a Spark cluster running in standalone mode, with the Spark Master
configured for High Availability.
I am planning to upgrade Spark from 1.0 to 1.1, but I don't want to
interrupt the currently running jobs.
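
For context, the masters use the standard ZooKeeper-based recovery,
set up roughly as follows in conf/spark-env.sh (this is a sketch; the
zk1/zk2/zk3 hostnames are placeholders, not my actual values):

    # On each master: enable ZooKeeper recovery so a standby master
    # can take over if the active one goes down.
    export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
      -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
      -Dspark.deploy.zookeeper.dir=/spark"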

(1) Is there any way to perform a rolling upgrade (while a job is running)?
(2) If not, would using YARN as the cluster manager allow me to perform
a rolling upgrade?

Thanks,

Kenichi

-- 
Kenichi Maehashi <webmas...@kenichimaehashi.com>
