[ 
https://issues.apache.org/jira/browse/SPARK-19264?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15828366#comment-15828366
 ] 

hustfxj commented on SPARK-19264:
---------------------------------

 Maybe you are right: we can't hard-kill the driver. But I still don't think 
this is a good design — the driver should terminate when the main thread quits. 
At a minimum, we should document this behavior so users are aware of the issue 
when their program spawns non-daemon threads. Thank you.
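To illustrate the behavior being discussed: a minimal Java sketch (class and thread names are illustrative, not from Spark) showing that the JVM keeps running after `main` returns as long as a non-daemon thread is alive — which is why the worker cannot infer the driver's state from the main thread alone:

```java
public class NonDaemonDemo {
    public static void main(String[] args) {
        // Threads created from main are non-daemon by default.
        Thread worker = new Thread(() -> {
            try {
                Thread.sleep(200);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            System.out.println("worker done");
        }, "background-worker");
        // worker.setDaemon(true);  // uncommenting this would let the JVM
        //                          // exit as soon as main returns
        worker.start();
        System.out.println("main done");
        // main returns here, but the JVM does not exit until the
        // non-daemon worker thread finishes (or System.exit is called).
    }
}
```

With `setDaemon(true)` the process would terminate together with `main`; without it, the process outlives the main thread, as described above.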

> Worker should start the driver, the same as the AM of YARN
> ----------------------------------------------------------
>
>                 Key: SPARK-19264
>                 URL: https://issues.apache.org/jira/browse/SPARK-19264
>             Project: Spark
>          Issue Type: Improvement
>            Reporter: hustfxj
>
>   I think the worker shouldn't start the driver via "ProcessBuilderLike", 
> because then we can't tell whether the application's main thread has finished 
> if the application has spawned non-daemon threads. The JVM terminates only 
> when no non-daemon threads remain running (or someone calls System.exit), so 
> the main thread may have finished long ago.
>     The worker should start the driver the way the AM of YARN does, as follows:
> {code:title=ApplicationMaster.scala|borderStyle=solid}    
>      mainMethod.invoke(null, userArgs.toArray)
>      finish(FinalApplicationStatus.SUCCEEDED, ApplicationMaster.EXIT_SUCCESS)
>      logDebug("Done running users class")
> {code}
> Then the worker can monitor the driver's main thread directly and know the 
> application's state.
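The AM-style approach quoted above could be sketched roughly as follows — a hedged Java illustration, where `UserApp` and `DriverRunnerSketch` are hypothetical names, not Spark classes. The worker invokes the user's `main` via reflection on a dedicated thread and joins it, so it knows exactly when the main thread completes:

```java
import java.lang.reflect.Method;

public class DriverRunnerSketch {
    // Hypothetical user application whose main returns quickly.
    public static class UserApp {
        public static void main(String[] args) {
            System.out.println("user main finished");
        }
    }

    public static void main(String[] args) throws Exception {
        // Look up the user's main method reflectively, as the YARN AM does.
        Method mainMethod = UserApp.class.getMethod("main", String[].class);

        // Run the user's main in a dedicated thread so the worker can
        // observe when it completes, independent of other non-daemon threads.
        Thread userThread = new Thread(() -> {
            try {
                mainMethod.invoke(null, (Object) new String[0]);
            } catch (Exception e) {
                e.printStackTrace();
            }
        }, "Driver");
        userThread.start();
        userThread.join();  // returns as soon as the user's main exits
        System.out.println("driver main thread finished");
    }
}
```

Note this only tells the worker that the *main thread* finished; any non-daemon threads the user spawned would still keep the JVM alive, which is exactly the trade-off discussed in the comment above.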



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
