Qiao, Richard wrote:
> Comparing #1 and #3, my understanding of “submitted” is “the jar is
> submitted to executors”. With this concept, you may define your own
> status.

In SparkLauncher, SUBMITTED means that the Driver was able to acquire cores
from the Spark cluster and the Launcher is waiting for the Driver to connect
back. Once it connects back, the Driver's state changes to CONNECTED.
As Marcelo mentioned, the Launcher can only tell me about the Driver's state;
it cannot tell me the state of the "application (executors)". For the state
of the executors we can use a SparkListener.
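
For the Listener side, here is a minimal sketch (assuming Spark 2.x; the class
name and the println are purely illustrative) of a SparkListener that notices
when the first executor is granted to the application:

import java.util.concurrent.atomic.AtomicBoolean

import org.apache.spark.scheduler.{SparkListener, SparkListenerExecutorAdded}

// Driver-side listener: flips a flag the first time the cluster manager
// hands an executor to this application.
class FirstExecutorListener extends SparkListener {
  val executorSeen = new AtomicBoolean(false)

  override def onExecutorAdded(event: SparkListenerExecutorAdded): Unit = {
    if (executorSeen.compareAndSet(false, true)) {
      println(s"First executor ${event.executorId} added on host " +
        s"${event.executorInfo.executorHost}")
    }
  }
}

// Register it after creating the SparkContext:
//   sc.addSparkListener(new FirstExecutorListener)
// or via --conf spark.extraListeners=com.example.FirstExecutorListener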

With the combination of both the Launcher and the Listener, I have a solution.
As you mentioned, the state changes to RUNNING even if only one executor is
allocated to the application. So in my application, I change the status of my
job to RUNNING only when I receive RUNNING from the Launcher and an
onExecutorAdded event from the SparkListener.
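
For what it's worth, a rough sketch of how that combination could look from
the launching process is below. The jar path, main class, and the
executorAddedSignal flag (standing in for whatever channel carries the
driver-side onExecutorAdded signal back to the launcher process) are all
assumptions, not the actual setup:

import java.util.concurrent.atomic.AtomicBoolean

import org.apache.spark.launcher.{SparkAppHandle, SparkLauncher}

object LaunchWithCombinedStatus {

  // Hypothetical flag set once the driver-side listener reports
  // onExecutorAdded (in reality this must cross process boundaries,
  // e.g. via a queue or a shared store).
  val executorAddedSignal = new AtomicBoolean(false)

  @volatile var jobStatus: String = "PENDING"

  def main(args: Array[String]): Unit = {
    val handle = new SparkLauncher()
      .setAppResource("/path/to/my-app.jar")   // illustrative path
      .setMainClass("com.example.MyApp")       // illustrative class
      .setMaster("yarn")
      .startApplication(new SparkAppHandle.Listener {
        override def stateChanged(h: SparkAppHandle): Unit = {
          // Mark the job RUNNING only when the Launcher reports RUNNING
          // *and* at least one executor has been added.
          if (h.getState == SparkAppHandle.State.RUNNING &&
              executorAddedSignal.get()) {
            jobStatus = "RUNNING"
          }
        }
        override def infoChanged(h: SparkAppHandle): Unit = ()
      })

    // In practice the same check also has to run when the executor signal
    // arrives after the state change, e.g. by polling handle.getState.
  }
}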
