Hello,

I am submitting a Spark job using SparkSubmit. When I kill my application,
it does not kill the corresponding Spark job. How would I kill the
corresponding Spark job? I know one way is to use SparkSubmit again with the
appropriate options. Is there any way, though, by which I can tell SparkSubmit
at the time of job submission itself? Here is my code:

import org.apache.spark.deploy.SparkSubmit;

public class MyClass {

    public static void main(String[] args) {
        // prepare the spark-submit arguments here (args)
        // then hand them straight to SparkSubmit's own entry point
        SparkSubmit.main(args);
    }
}
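For reference, here is a rough sketch of an alternative I have been considering
(not tested): assuming Spark 1.6+ and the spark-launcher module, SparkLauncher
returns a SparkAppHandle whose kill() could be called from a JVM shutdown hook,
so the Spark job would be taken down when my application is terminated. The jar
path, main class, and master URL below are placeholders, not my real values.

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class MyLauncher {

    public static void main(String[] args) throws Exception {
        // Launch through SparkLauncher instead of SparkSubmit.main(),
        // which gives back a handle to the underlying Spark application.
        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("/path/to/my-spark-job.jar")   // placeholder jar
            .setMainClass("com.example.MySparkJob")        // placeholder class
            .setMaster("spark://master:7077")              // placeholder master
            .startApplication();

        // If this JVM is terminated (e.g. SIGTERM), try to kill the Spark
        // application as well. Note: shutdown hooks do not run on SIGKILL.
        Runtime.getRuntime().addShutdownHook(new Thread(handle::kill));

        // Block until the Spark application reaches a final state.
        while (!handle.getState().isFinal()) {
            Thread.sleep(1000);
        }
    }
}

Would this be the recommended approach, or is there a way to get the same
behaviour when going through SparkSubmit directly?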
