Hello Behroz,

you can use a SparkAppHandle.Listener to get updates from the launched
application (c.f.
https://spark.apache.org/docs/2.2.0/api/java/org/apache/spark/launcher/SparkAppHandle.Listener.html
). Note that this is the launcher-side listener, not the driver-side
SparkListener from org.apache.spark.scheduler.

First you need to create your own SparkAppListener class that implements it:
-------------------------------------
private static class SparkAppListener implements SparkAppHandle.Listener,
        Runnable {

        SparkAppListener() {}

        @Override
        public void stateChanged(SparkAppHandle handle) {
            String sparkAppId = handle.getAppId();
            SparkAppHandle.State appState = handle.getState();
            log.info("Spark job with app id: " + sparkAppId
                    + ", state changed to: " + appState);
        }

        @Override
        public void infoChanged(SparkAppHandle handle) {}

        @Override
        public void run() {}
    }
-----------------------------------------



Then you can run it in a thread via an Executors.newCachedThreadPool (or
simply with new Thread(...)):
-----------------------------------------
private final static ExecutorService listenerService =
Executors.newCachedThreadPool();

SparkAppListener appListener = new SparkAppListener();
listenerService.execute(appListener);

SparkLauncher launcher = new SparkLauncher()
                .setAppName(appName)
                .setSparkHome(sparkHome)
                .setAppResource(appResource)
                .setMainClass(mainClass)
                .setMaster(master) .........

SparkAppHandle appHandle = launcher.startApplication(appListener);
-----------------------------------------
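If the launching thread should block until the job terminates, one common
option is to have the listener count down a latch once the state becomes
final. A minimal self-contained sketch (the app name, paths and master below
are placeholders, not values from this thread):

```java
import java.util.concurrent.CountDownLatch;

import org.apache.spark.launcher.SparkAppHandle;
import org.apache.spark.launcher.SparkLauncher;

public class BlockingLaunchExample {

    public static void main(String[] args) throws Exception {
        CountDownLatch done = new CountDownLatch(1);

        // Release the latch once the application reaches a final state
        // (FINISHED, FAILED, KILLED or LOST).
        SparkAppHandle.Listener listener = new SparkAppHandle.Listener() {
            @Override
            public void stateChanged(SparkAppHandle handle) {
                if (handle.getState().isFinal()) {
                    done.countDown();
                }
            }

            @Override
            public void infoChanged(SparkAppHandle handle) {}
        };

        SparkAppHandle handle = new SparkLauncher()
                .setAppName("my-app")                      // placeholder
                .setSparkHome("/opt/spark")                // placeholder
                .setAppResource("/path/to/my-app.jar")     // placeholder
                .setMainClass("com.example.Main")          // placeholder
                .setMaster("local[*]")
                .startApplication(listener);

        done.await();  // blocks until the job terminates
        System.out.println("Final state: " + handle.getState());
    }
}
```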

At this point, every time the state changes, the
SparkAppListener.stateChanged method will be invoked.
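For reference, SparkAppHandle.State is an enum (UNKNOWN, CONNECTED,
SUBMITTED, RUNNING, FINISHED, FAILED, KILLED, LOST), and its isFinal()
method tells you whether the application has terminated. A listener that
distinguishes in-flight from terminal states could look like this (a
sketch; the class name is mine):

```java
import org.apache.spark.launcher.SparkAppHandle;

public class VerboseSparkAppListener implements SparkAppHandle.Listener {

    @Override
    public void stateChanged(SparkAppHandle handle) {
        SparkAppHandle.State state = handle.getState();
        if (state.isFinal()) {
            // FINISHED, FAILED, KILLED or LOST: the application is done.
            System.out.println("App " + handle.getAppId()
                    + " terminated with state: " + state);
        } else {
            // UNKNOWN, CONNECTED, SUBMITTED or RUNNING: still in flight.
            System.out.println("App " + handle.getAppId()
                    + " is now in state: " + state);
        }
    }

    @Override
    public void infoChanged(SparkAppHandle handle) {
        // Invoked when handle info (e.g. the app ID) becomes available.
    }
}
```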

Hope it helps,
Davide


