Github user Sephiroth-Lin commented on the pull request:

    https://github.com/apache/spark/pull/6409#issuecomment-105715873
  
    @tgravescs I have tested the following: max retries left at the default, running SparkPi with parameter 20000, and using `yarn application -kill` to kill the application after it started running. Results are in the table below (a sketch of the debug print follows it).
    -------------------------------------------------------------
    Mode            YARN UI    AppMaster log final status
                               (debug print added at ApplicationMaster.scala line 127)
    -------------------------------------------------------------
    yarn-cluster    KILLED     FAILED
    yarn-client     KILLED     UNDEFINED
    -------------------------------------------------------------
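    For reference, a minimal standalone sketch of the kind of final-status debug print used for this test. The actual code added at ApplicationMaster.scala line 127 is not shown in this comment, so the object name `FinalStatusDebug`, the `report` helper, and the message text are hypothetical; the only real pieces are the `FinalApplicationStatus` enum from hadoop-yarn-api and the FAILED/UNDEFINED values observed above.

    ```scala
    import org.apache.hadoop.yarn.api.records.FinalApplicationStatus

    // Standalone sketch only -- not the real ApplicationMaster code. It just shows
    // the kind of final-status debug print added for this test.
    object FinalStatusDebug {
      // Print the final status the AM would report to YARN, plus a short diagnostic.
      def report(status: FinalApplicationStatus, diagnostics: String): Unit = {
        println(s"Final application status reported to YARN: $status ($diagnostics)")
      }

      def main(args: Array[String]): Unit = {
        // What the AM log showed for the yarn-cluster kill test above.
        report(FinalApplicationStatus.FAILED, "killed via `yarn application -kill`")
        // What the AM log showed for the yarn-client kill test above.
        report(FinalApplicationStatus.UNDEFINED, "killed via `yarn application -kill`")
      }
    }
    ```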
    
    @vanzin yes, this may break application retries; it needs more consideration, and I will look into it.
    @srowen @tgravescs @vanzin thank you.

