Spark 0.9.0 does include standalone scheduler HA, but it requires running
multiple masters. The docs are located here:
https://spark.apache.org/docs/0.9.0/spark-standalone.html#high-availability
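
For reference, standby masters are wired up by pointing every master at a
ZooKeeper ensemble through SPARK_DAEMON_JAVA_OPTS in conf/spark-env.sh. A
minimal sketch (the ZooKeeper hosts and znode dir below are just example
values):

    # conf/spark-env.sh on each master node (hostnames are placeholders)
    export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=ZOOKEEPER \
      -Dspark.deploy.zookeeper.url=zk1:2181,zk2:2181,zk3:2181 \
      -Dspark.deploy.zookeeper.dir=/spark"

Applications should then list all masters in the URL, e.g.
spark://master1:7077,master2:7077, so they fail over to whichever master is
the current leader.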

0.9.0 also includes driver HA (for long-running normal or streaming jobs): you
can submit a driver into the standalone cluster, and it will be restarted
automatically if it crashes. That doc is on the same page:
https://spark.apache.org/docs/0.9.0/spark-standalone.html#launching-applications-inside-the-cluster
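
As a rough sketch of that mode (the cluster URL, jar location, and main class
below are placeholder values), the driver is launched through the standalone
Client, and --supervise asks the master to restart it if it dies:

    # run from the Spark installation directory; all URLs below are examples
    ./bin/spark-class org.apache.spark.deploy.Client launch \
      --supervise \
      spark://master1:7077,master2:7077 \
      hdfs://namenode:9000/user/me/my-app.jar \
      com.example.MyApp

Note that the jar has to live somewhere the whole cluster can see (e.g. HDFS),
since any worker may end up running the driver.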

Please let me know if you have further questions.


On Mon, Mar 10, 2014 at 6:57 PM, qingyang li <liqingyang1...@gmail.com> wrote:

> Is Spark 0.9.0 HA? We only have one master server, so I think it is not.
> Does anyone know how to support HA for Spark?
>
