Aseem Bansal created SPARK-17742:
------------------------------------

             Summary: Spark Launcher does not get failed state in Listener
                 Key: SPARK-17742
                 URL: https://issues.apache.org/jira/browse/SPARK-17742
             Project: Spark
          Issue Type: Bug
          Components: Spark Submit
    Affects Versions: 2.0.0
            Reporter: Aseem Bansal
I tried to launch an application using the code below (dummy code that reproduces the problem). I tried exiting Spark with status -1, throwing an exception, etc., but in no case did the listener report a FAILED state. If a Spark job exits with -1 or throws an exception from its main method, that should be treated as a failure.

{code}
package com.example;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

import java.io.IOException;

public class Main2 {
    public static void main(String[] args) throws IOException, InterruptedException {
        SparkLauncher launcher = new SparkLauncher()
                .setSparkHome("/opt/spark2")
                .setAppResource("/home/aseem/projects/testsparkjob/build/libs/testsparkjob-1.0-SNAPSHOT.jar")
                .setMainClass("com.example.Main")
                .setMaster("local[2]");

        launcher.startApplication(new MyListener());
        Thread.sleep(1000 * 60);
    }
}

class MyListener implements SparkAppHandle.Listener {
    @Override
    public void stateChanged(SparkAppHandle handle) {
        System.out.println("state changed " + handle.getState());
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {
        System.out.println("info changed " + handle.getState());
    }
}
{code}

The Spark job is

{code}
package com.example;

import org.apache.spark.sql.SparkSession;

import java.io.IOException;

public class Main {
    public static void main(String[] args) throws IOException {
        SparkSession sparkSession = SparkSession
                .builder()
                .appName("" + System.currentTimeMillis())
                .getOrCreate();

        try {
            for (int i = 0; i < 15; i++) {
                Thread.sleep(1000);
                System.out.println("sleeping 1");
            }
        } catch (InterruptedException e) {
            e.printStackTrace();
        }

        // sparkSession.stop();
        System.exit(-1);
    }
}
{code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
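One side note on the reproduction itself: the fixed {{Thread.sleep(1000 * 60)}} in the driver is fragile, since the application may finish earlier or later. A latch keyed on terminal states is a more robust way to wait. Below is a minimal self-contained sketch of that pattern; the {{State}} enum and {{Listener}} interface here are simplified stand-ins for {{SparkAppHandle}}'s (assumed names, no Spark dependency), so it illustrates the waiting pattern only, not the bug.

{code}
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicReference;

// Self-contained sketch: State and Listener below are stand-ins that mirror
// org.apache.spark.launcher.SparkAppHandle, so this runs without Spark.
public class ListenerSketch {

    enum State {
        UNKNOWN, CONNECTED, SUBMITTED, RUNNING, FINISHED, FAILED, KILLED;

        boolean isFinal() {
            return this == FINISHED || this == FAILED || this == KILLED;
        }
    }

    interface Listener {
        void stateChanged(State newState);
    }

    // Block until the listener sees a terminal state (or the timeout expires),
    // instead of sleeping a fixed 60 seconds as in the reproduction above.
    static State awaitTerminalState() throws InterruptedException {
        CountDownLatch done = new CountDownLatch(1);
        AtomicReference<State> last = new AtomicReference<>(State.UNKNOWN);

        Listener listener = newState -> {
            last.set(newState);
            if (newState.isFinal()) {
                done.countDown();
            }
        };

        // Simulate the launcher firing callbacks from a background thread.
        new Thread(() -> {
            listener.stateChanged(State.RUNNING);
            listener.stateChanged(State.FINISHED);
        }).start();

        done.await(10, TimeUnit.SECONDS);
        return last.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println("terminal state: " + awaitTerminalState());
    }
}
{code}

With the real API the same idea applies: count down the latch inside {{stateChanged}} whenever {{handle.getState().isFinal()}} is true, then await it in the driver.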