Can I cut a steak with a hammer? Sure you can, but the steak would taste awful.

Do you have organizational/bureaucratic issues with using a load balancer? 
Because that’s what you really need: run your application on multiple nodes 
with a load balancer in front. When a node crashes, the load balancer shifts 
traffic to the healthy nodes until the crashed node recovers.
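
For the load balancer to notice a crashed node it needs something to probe, so each node usually exposes a health check. A minimal sketch, assuming the app can embed the JDK's built-in HTTP server (the port and path here are made up):

import com.sun.net.httpserver.HttpServer;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

// Minimal health-check endpoint for the load balancer to poll on each node.
// Port 8081 and path /healthz are illustrative choices, not requirements.
public class HealthCheck {
    public static void main(String[] args) throws Exception {
        HttpServer server = HttpServer.create(new InetSocketAddress(8081), 0);
        server.createContext("/healthz", exchange -> {
            byte[] body = "OK".getBytes(StandardCharsets.UTF_8);
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream os = exchange.getResponseBody()) {
                os.write(body);
            }
        });
        server.start();
        // ... start the real application here; if this JVM dies, the endpoint
        // disappears and the load balancer routes around the node.
    }
}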

From: Sergey Oboguev <obog...@gmail.com>
Date: Friday, March 12, 2021 at 2:53 PM
To: User <user@spark.apache.org>
Subject: [EXTERNAL] Using Spark as a fail-over platform for Java app


I have an existing plain-Java (non-Spark) application that needs to run in a 
fault-tolerant way, i.e. if its node crashes, the application is restarted on 
another node, and if the application crashes because of an internal fault, it 
is restarted as well.

Normally I would run it in Kubernetes, but in this specific case Kubernetes 
is unavailable because of organizational/bureaucratic issues, and the only 
execution platform available in the domain is Spark.

Is it possible to wrap the application into a Spark-based launcher that will 
take care of executing the application and restarting it on failure?
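
Roughly what I have in mind (just a sketch, assuming YARN as the cluster manager and with ExistingApp standing in for the real entry point): a trivial Spark driver that does nothing but run the application, submitted in cluster mode so a failed driver attempt can be retried.

import org.apache.spark.sql.SparkSession;

// Trivial Spark driver whose only job is to run the existing application.
// ExistingApp is a stand-in for the real application's entry point.
public class AppWrapper {
    public static void main(String[] args) throws Exception {
        // A SparkSession is created only so the YARN ApplicationMaster sees an
        // initialized context; the wrapped app itself does not use Spark.
        SparkSession spark = SparkSession.builder()
                .appName("java-app-wrapper")
                .getOrCreate();
        try {
            ExistingApp.main(args);   // run the plain-Java application
        } finally {
            spark.stop();
        }
    }
}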

Execution must be in a separate JVM, apart from other apps.

And for optimum performance, the application also needs to be assigned 
guaranteed resources, i.e. the number of cores and the amount of RAM it 
requires, so it would be great if the launcher could take care of this too.
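
The submission side could then look something like this (again only a sketch; the jar path, memory/core numbers, and attempt count are made up, and it assumes YARN cluster mode so the driver is re-attempted automatically on failure):

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

// Launch the wrapper on YARN in cluster mode with pinned driver resources
// and automatic re-attempts on failure. All concrete values are placeholders.
public class Launch {
    public static void main(String[] args) throws Exception {
        SparkAppHandle handle = new SparkLauncher()
                .setAppResource("/path/to/app-wrapper.jar")
                .setMainClass("AppWrapper")
                .setMaster("yarn")
                .setDeployMode("cluster")
                .setConf(SparkLauncher.DRIVER_MEMORY, "8g")   // guaranteed RAM for the app's JVM
                .setConf("spark.driver.cores", "4")           // guaranteed cores
                .setConf("spark.yarn.maxAppAttempts", "4")    // re-attempt the driver on failure
                .startApplication();

        while (!handle.getState().isFinal()) {
            Thread.sleep(5_000);   // wait until the application terminates
        }
    }
}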

Thanks for any advice.
