GitHub user subrotosanyal opened a pull request:

    https://github.com/apache/spark/pull/13497

    Added a new State (FINISHED_UNKNOWN) for the listeners of SparkLauncher

    ## What changes were proposed in this pull request?
    This situation can happen when the LauncherConnection gets an exception 
while reading from the socket and terminates silently without notifying the 
client/listener, which is left believing that the job is still in its previous 
state.
    The fix forces a notification to the client that the job finished with an 
unknown status and lets the client handle it accordingly.
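    As a rough illustration (not part of this patch), this is how a client 
using the `SparkLauncher` API would observe the new state through its listener. 
The jar path, main class, and master are placeholders, and `FINISHED_UNKNOWN` 
only exists once this change is applied:
    
    ```java
    import org.apache.spark.launcher.SparkAppHandle;
    import org.apache.spark.launcher.SparkLauncher;
    
    public class LauncherStateSketch {
      public static void main(String[] args) throws Exception {
        // Launch the application in-process and register a listener.
        SparkAppHandle handle = new SparkLauncher()
            .setAppResource("/path/to/app.jar")   // placeholder application jar
            .setMainClass("com.example.MyApp")    // placeholder main class
            .setMaster("local[*]")
            .startApplication(new SparkAppHandle.Listener() {
              @Override
              public void stateChanged(SparkAppHandle h) {
                // With this change, a spark-submit JVM crash that kills the
                // launcher connection surfaces here as FINISHED_UNKNOWN
                // instead of leaving the handle stuck in its previous state.
                System.out.println("State changed to: " + h.getState());
              }
    
              @Override
              public void infoChanged(SparkAppHandle h) {
                System.out.println("Application id: " + h.getAppId());
              }
            });
    
        // Poll until the handle reaches a terminal state.
        while (!handle.getState().isFinal()) {
          Thread.sleep(1000);
        }
        System.out.println("Final state: " + handle.getState());
      }
    }
    ```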
    
    
    ## How was this patch tested?
    Added a unit test.
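    The test is not quoted in this mail; as a hypothetical sketch of the kind 
of assertion such a test could make (it assumes a build that includes this 
patch, since `FINISHED_UNKNOWN` is not in released Spark, and `valueOf` would 
otherwise throw `IllegalArgumentException`):
    
    ```java
    import static org.junit.Assert.assertTrue;
    
    import org.apache.spark.launcher.SparkAppHandle;
    import org.junit.Test;
    
    public class FinishedUnknownSuite {
      @Test
      public void finishedUnknownIsTerminal() {
        // The state added by this patch should be final, so listeners
        // waiting on the handle stop waiting once it is reported.
        assertTrue(SparkAppHandle.State.valueOf("FINISHED_UNKNOWN").isFinal());
      }
    }
    ```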
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/subrotosanyal/spark 
SPARK-15652-handle-spark-submit-jvm-crash

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/13497.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #13497
    
----
commit 3d9b78f4656a2e2329ee014199e7cc4fe42ac375
Author: Subroto Sanyal <ssan...@datameer.com>
Date:   2016-06-03T10:58:51Z

    SPARK-15652 Added a new State (FINISHED_UNKNOWN) which will be used if the 
LauncherConnection gets an exception while reading from the socket. Such a read 
failure can be caused by a crash of the spark-submit JVM. The fix forces a 
notification to the client that the job finished with an unknown status.

----


