I have an existing plain-Java (non-Spark) application that needs to run
in a fault-tolerant way: if the node it runs on crashes, the application
should be restarted on another node, and if the application itself
crashes because of an internal fault, it should be restarted as well.

Normally I would run it on Kubernetes, but in this specific case
Kubernetes is unavailable for organizational/bureaucratic reasons, and
the only execution platform available in this environment is Spark.

Is it possible to wrap the application in a Spark-based launcher that
takes care of executing it and restarting it on failure?

Execution must happen in a separate JVM, isolated from other
applications.
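
For concreteness, here is roughly what I have in mind, using the
org.apache.spark.launcher.SparkLauncher API against a standalone
cluster. This is only a sketch: the jar path, main class, and master
URL are placeholders, and SPARK_HOME is assumed to be set on the
machine doing the submitting.

    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;

    public class FaultTolerantLauncher {
        public static void main(String[] args) throws Exception {
            SparkAppHandle handle = new SparkLauncher()
                // Placeholder jar path and main class of the plain-Java app.
                .setAppResource("/path/to/my-app.jar")
                .setMainClass("com.example.MyApp")
                // Assumes a Spark standalone cluster.
                .setMaster("spark://spark-master:7077")
                // Cluster deploy mode: the app runs as the Spark driver in
                // its own JVM on a worker node, apart from other apps.
                .setDeployMode("cluster")
                // Standalone cluster mode: the master restarts the driver
                // if it exits non-zero or the node running it goes down.
                .setConf("spark.driver.supervise", "true")
                .startApplication();

            // Wait until spark-submit reports a final state for the
            // submission.
            while (!handle.getState().isFinal()) {
                Thread.sleep(1000);
            }
        }
    }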

For optimal performance, the application also needs to be assigned
guaranteed resources, i.e. the number of cores and the amount of RAM it
requires, so it would be great if the launcher could take care of this
too.
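
I assume the resource guarantees could be expressed as a couple of
extra settings on the SparkLauncher chain above, before
startApplication(); the values here are placeholders:

        // Reserve RAM and cores for the application's driver JVM.
        .setConf(SparkLauncher.DRIVER_MEMORY, "4g") // "spark.driver.memory"
        .setConf("spark.driver.cores", "2")         // standalone cluster mode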

Thanks for any advice.
