Re: Submit & Kill Spark Application programmatically from another application

2015-05-03 Thread Chester Chen
Sounds like you are running in yarn-cluster mode.

I created JIRA SPARK-3913
(https://issues.apache.org/jira/browse/SPARK-3913) with PR
https://github.com/apache/spark/pull/2786.

Is this what you are looking for?
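
If you can pick up the launcher module that is landing in Spark 1.4
(org.apache.spark.launcher.SparkLauncher), it wraps spark-submit behind a
builder API. A rough sketch in Scala; the Spark home, jar path, main class
and master URL below are all placeholders:

    import org.apache.spark.launcher.SparkLauncher

    // Rough sketch: build and fork a spark-submit child process. launch()
    // returns a plain java.lang.Process whose streams and exit code the
    // calling service can monitor. Every value below is a placeholder.
    val process = new SparkLauncher()
      .setSparkHome("/opt/spark")
      .setAppResource("/path/to/spark-sql-app.jar")
      .setMainClass("com.example.SqlRunner")
      .setMaster("spark://master:7077")
      .setDeployMode("cluster")
      .addAppArgs("--query", "SELECT 1")
      .launch()

    val exitCode = process.waitFor()   // block until spark-submit exits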

Chester

On Sat, May 2, 2015 at 10:32 PM, Yijie Shen henry.yijies...@gmail.com
wrote:

 Hi,

 I’ve posted this problem on user@spark but got no reply, so I’ve moved it
 to dev@spark; sorry for the duplication.

 I am wondering if it is possible to submit, monitor & kill Spark
 applications from another service.

 I have written a service that does this:

 1. Parse user commands.
 2. Translate them into arguments understandable by an already prepared
 Spark-SQL application.
 3. Submit the application along with those arguments to the Spark cluster,
 invoking spark-submit from a ProcessBuilder (a sketch of this step follows
 the list).
 4. Run the generated application's driver in cluster mode.
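
 A minimal sketch of step 3 (Scala; the paths, class name and arguments are
 placeholders):

     import scala.io.Source

     // Fork spark-submit as a child process and capture its combined
     // output, so whatever the submission prints (including a driver ID)
     // stays available to the service. All values are placeholders.
     val pb = new ProcessBuilder(
       "/opt/spark/bin/spark-submit",
       "--master", "spark://master:7077",
       "--deploy-mode", "cluster",
       "--class", "com.example.SqlRunner",
       "/path/to/spark-sql-app.jar",
       "--query", "SELECT 1")
     pb.redirectErrorStream(true)   // merge stderr into stdout
     val proc = pb.start()
     val output = Source.fromInputStream(proc.getInputStream).mkString
     val exitCode = proc.waitFor()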
 The above four steps are done, but I have difficulties with these two:

 1. Querying an application's status, for example its percentage of
 completion.
 2. Killing queries accordingly.
 What I find in the Spark standalone documentation suggests killing an
 application using:

 ./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>

 and says I should find the driver ID through the standalone Master web UI
 at http://<master url>:8080.
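
 Presumably the master's REST submission server (added in Spark 1.3, default
 port 6066) could also be called directly instead of scraping the web UI; a
 sketch, where the master host and the driver ID are placeholders:

     import java.net.{HttpURLConnection, URL}
     import scala.io.Source

     // Issue a bare HTTP call and return the JSON body the master
     // replies with.
     def call(method: String, url: String): String = {
       val conn = new URL(url).openConnection().asInstanceOf[HttpURLConnection]
       conn.setRequestMethod(method)
       val body = Source.fromInputStream(conn.getInputStream).mkString
       conn.disconnect()
       body
     }

     val rest = "http://master:6066/v1/submissions"
     val driverId = "driver-20150503123456-0000"   // placeholder ID

     // The status response carries a driverState field (RUNNING, FINISHED,
     // KILLED, ...); note this is the driver's state, not a percentage of
     // completion.
     println(call("GET", s"$rest/status/$driverId"))

     // Ask the master to kill that driver.
     println(call("POST", s"$rest/kill/$driverId"))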

 Is there any programmatic way to get the driver ID of the application
 submitted by my `ProcessBuilder`, and to query its status?
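
 (If the output captured from the ProcessBuilder is at hand, one guess is to
 match the ID pattern itself rather than any surrounding message text, which
 varies across versions; the driver-<timestamp>-<sequence> shape below is an
 assumption based on the standalone master's naming scheme:)

     // Pull a standalone driver ID, e.g. driver-20150503123456-0000, out
     // of whatever spark-submit printed; None if nothing matched.
     val DriverIdPattern = """driver-\d{14}-\d{4}""".r

     def extractDriverId(submitOutput: String): Option[String] =
       DriverIdPattern.findFirstIn(submitOutput)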

 Any suggestions?

 —
 Best Regards!
 Yijie Shen

