Maybe there are other ways, but I think the most common path is using Apache Livy (https://livy.apache.org/).
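
With Livy running next to the cluster, the container only needs an HTTP client: you POST the job as a batch to Livy's /batches endpoint and then poll its state. A rough sketch (hostname, jar path and class name below are placeholders for your own):

  curl -X POST http://livy-host:8998/batches \
    -H 'Content-Type: application/json' \
    -d '{"file": "hdfs:///jobs/myjob.jar", "className": "myclass"}'

  # the response contains a batch id; poll it until the job finishes
  curl http://livy-host:8998/batches/0/state

Livy runs the spark-submit on its side, so no Spark or Hadoop JARs are needed in the container.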

On 02/09/2020 17:58, Eric Beabes wrote:
Under Spark 2.4, is it possible to submit a Spark job through a REST API, the way a Flink job can be?

Here's the use case: we need to submit a Spark job to an EMR cluster, but our security team won't allow us to submit jobs from the master node or through the UI. They want us to submit jobs from a Docker container.

If it's possible to submit the Spark job through REST, then we don't need to install the Spark/Hadoop JARs in the container. If the REST API isn't an option, can we do something like this?

spark-2.4.6-bin-hadoop2.7/bin/spark-submit \
  --class myclass --master "yarn url" --deploy-mode cluster \
  ...

In other words, instead of --master yarn, can we point --master at a URL? Would this still work the same way?
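
For what it's worth, as far as I know that won't work as written: with YARN the --master value is the literal string "yarn", and spark-submit picks up the ResourceManager address from the Hadoop config files rather than from the --master value. So a container submitting directly would still need the Spark binaries plus a copy of the cluster's Hadoop configs (though not a full Hadoop install). A minimal sketch, with placeholder paths:

  # yarn-site.xml and core-site.xml copied from the cluster
  export HADOOP_CONF_DIR=/opt/hadoop-conf
  spark-2.4.6-bin-hadoop2.7/bin/spark-submit \
    --class myclass \
    --master yarn \
    --deploy-mode cluster \
    /opt/jobs/myjob.jar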
