Hello,

I have a driver deployed using `spark-submit` in supervised cluster mode.
Sometimes my application dies due to a transient problem and the restart
works perfectly. However, it would be useful to get alerted when that
happens. Is there any out-of-the-box way of doing that? Perhaps a hook
that I can use to catch an event? I guess I could poll my application
state using the Spark REST API, but if there were something more elegant,
I would rather use it.
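For context, the polling workaround I have in mind would look roughly like the sketch below, hitting the standalone master's submission status endpoint (`/v1/submissions/status/<driver-id>`). The master host, port, and driver ID here are placeholders, and the alerting hook is just a `print`:

```python
import json
import time
import urllib.request

def extract_driver_state(body):
    """Pull the driverState field out of the master's JSON response."""
    return json.loads(body).get("driverState")

def fetch_driver_state(master_url, driver_id):
    """GET /v1/submissions/status/<driver-id> on the standalone master."""
    url = f"{master_url}/v1/submissions/status/{driver_id}"
    with urllib.request.urlopen(url) as resp:
        return extract_driver_state(resp.read())

def watch(master_url, driver_id, interval=60):
    """Poll the driver state and report transitions between polls."""
    last = None
    while True:
        state = fetch_driver_state(master_url, driver_id)
        if state != last:
            # Replace this print with a real alert (email, pager, etc.)
            print(f"driver state changed: {last} -> {state}")
            last = state
        time.sleep(interval)

# Example (placeholder host and driver ID):
# watch("http://spark-master:6066", "driver-20240101000000-0000")
```

This is exactly the kind of sidecar polling I would rather avoid if Spark already exposes a restart event or listener hook.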

Thanks in advance,
Rafael Barreto
