Hi,

I am wondering whether it is possible to submit, monitor, and kill Spark
applications from another service.

I have written a service that does the following:

1. parse user commands
2. translate them into arguments for an already prepared Spark-SQL application
3. submit the application, along with its arguments, to the Spark cluster using
   spark-submit from a ProcessBuilder (a rough sketch follows this list)
4. run the generated application's driver in cluster mode
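
For reference, step 3 currently looks roughly like this; the Spark home, master
URL, jar path, main class, and arguments below are all placeholders:

    // Sketch of step 3: launch spark-submit in cluster mode via ProcessBuilder.
    // Every path, the master URL, and the main class are placeholders;
    // translatedArgs holds the arguments derived from the parsed user command.
    import scala.collection.JavaConverters._

    val translatedArgs = Seq("--query", "SELECT ...")  // hypothetical example
    val command = Seq(
      "/opt/spark/bin/spark-submit",
      "--master", "spark://master-host:7077",
      "--deploy-mode", "cluster",
      "--class", "com.example.PreparedSqlApp",
      "/path/to/prepared-sql-app.jar") ++ translatedArgs
    val pb = new ProcessBuilder(command.asJava)
    pb.redirectErrorStream(true)  // merge stderr into stdout for easier scraping
    val proc = pb.start()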
All four steps above are working, but I am having difficulty with these two:

1. querying an application's status, e.g. its percentage of completion
2. killing a query accordingly
What I found in the Spark standalone documentation suggests killing an
application with:

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
and that the driver ID should be found through the standalone Master web UI at
http://<master url>:8080.
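
If the driver ID were known to my service, I assume that same command could
simply be driven programmatically as well, e.g.:

    // Sketch: issue the documented kill command for a known driver ID.
    // sparkHome and masterUrl are whatever the service is configured with;
    // the return value is the spark-class process's exit code.
    def killDriver(sparkHome: String, masterUrl: String, driverId: String): Int = {
      val pb = new ProcessBuilder(
        s"$sparkHome/bin/spark-class",
        "org.apache.spark.deploy.Client", "kill",
        masterUrl, driverId)
      pb.inheritIO()  // surface the client's output in the service's own log
      pb.start().waitFor()
    }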

Is there any programmatic way to get the driver ID of the application submitted
by my `ProcessBuilder`, and to query the status of that query?
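
My current idea, which I would like someone to confirm, is to scrape
spark-submit's output for the assigned driver ID (I believe the standalone
deploy client prints a line containing an ID like driver-20150712123456-0001,
but I have not verified the exact format), and then poll the master's JSON view
of the cluster state:

    // Sketch: scrape the driver ID from spark-submit's merged output (using
    // the `proc` from the submission sketch above), then poll the standalone
    // master's /json endpoint, which mirrors the :8080 web UI. Both the
    // log-line format and the JSON field names are assumptions on my part.
    import scala.io.Source

    val driverIdPattern = """driver-\d{14}-\d{4}""".r
    val driverId: Option[String] =
      Source.fromInputStream(proc.getInputStream)
        .getLines()
        .flatMap(driverIdPattern.findFirstIn)
        .toStream
        .headOption  // stop reading at the first match

    // The master web UI serves the same state as JSON at /json; I would look
    // the driver ID up there (e.g. under "activedrivers") to track its state.
    val masterState = Source.fromURL("http://master-host:8080/json").mkString

Even if this works, it would only give me the driver's state (RUNNING, FINISHED,
KILLED, ...), not a completion percentage, so I am still interested in better
options.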

Any suggestions?

— 
Best Regards!
Yijie Shen
