Hello, Eric
Maybe you can use Spark JobServer 0.10.0:
https://github.com/spark-jobserver/spark-jobserver
We used it with Spark 1.6, and it is awesome. The project is still very active, so I highly recommend it.
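In case it helps, job submission there is plain HTTP: you upload the application jar once as a named binary, then trigger runs with POST requests. A rough sketch against a local job server (the jar name and job class are made-up placeholders):

# Upload the application jar as a named binary
curl -X POST localhost:8090/binaries/my-app \
  -H "Content-Type: application/java-archive" \
  --data-binary @my-app.jar

# Start a job from that binary and wait synchronously for the result
curl -d "" "localhost:8090/jobs?appName=my-app&classPath=com.example.MyJob&sync=true"

# Or poll an async job by its id
curl localhost:8090/jobs/<jobId>

Since it is all HTTP, the submitting side (e.g. your Docker container) needs no Spark or Hadoop jars at all.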

Fusion Zhu


------------------------------------------------------------------
From: Eric Beabes <mailinglist...@gmail.com>
Sent: Thursday, September 3, 2020, 04:58
To: spark-user <user@spark.apache.org>
Subject: Submitting Spark Job thru REST API?

Under Spark 2.4, is it possible to submit a Spark job through a REST API, just like a Flink job?

Here's the use case: We need to submit a Spark job to the EMR cluster, but our security team is not allowing us to submit a job from the master node or through the UI. They want us to create a "Docker container" to submit the job.

If it's possible to submit the Spark job through REST, then we don't need to install the Spark/Hadoop JARs on the container. If it's not possible to use a REST API, can we do something like this?

spark-2.4.6-bin-hadoop2.7/bin/spark-submit \
  --class myclass --master "yarn url" --deploy-mode cluster \
  ...
In other words, instead of --master yarn, specify a URL. Would this still work 
the same way? 
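For reference, with YARN the --master flag takes the literal string "yarn" rather than a URL; spark-submit locates the ResourceManager through the Hadoop config files instead. A minimal sketch of that form (the config path and jar name here are illustrative):

# HADOOP_CONF_DIR must contain yarn-site.xml pointing at the EMR ResourceManager
export HADOOP_CONF_DIR=/etc/hadoop/conf
spark-2.4.6-bin-hadoop2.7/bin/spark-submit \
  --class myclass \
  --master yarn \
  --deploy-mode cluster \
  myapp.jar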
