[ https://issues.apache.org/jira/browse/SPARK-6735?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14608280#comment-14608280 ]
Thomas Graves commented on SPARK-6735:
--------------------------------------

A pull request was up, but I didn't have time to rework it to address some of the review comments; someone else is welcome to take this over. https://github.com/apache/spark/pull/5449

> Provide options to make the maximum executor failure count (which kills the
> application) relative to a window duration, or to disable it
> -----------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6735
>                 URL: https://issues.apache.org/jira/browse/SPARK-6735
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit, YARN
>    Affects Versions: 1.2.0, 1.2.1, 1.3.0
>            Reporter: Twinkle Sachdeva
>
> Currently there is a setting (spark.yarn.max.executor.failures) that sets the
> maximum number of executor failures, after which the application fails.
> For long-running applications, users may want to never kill the application,
> or to apply the limit only within a window of time. This improvement is to
> provide options to make the maximum executor failure count (which kills the
> application) relative to a window duration, or to disable it.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
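To illustrate the idea behind the proposal, here is a minimal sketch of a sliding-window failure tracker: failures older than the window no longer count toward the kill threshold, and a non-positive window disables the windowing so every failure counts, as with the current absolute limit. All class and method names here are hypothetical illustrations, not Spark's actual implementation.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Hypothetical sketch: track executor failures within a sliding time window.
// maxFailures plays the role of spark.yarn.max.executor.failures; the window
// parameter is the proposed addition.
public class WindowedFailureTracker {
    private final int maxFailures;          // failures tolerated before killing the app
    private final long windowMillis;        // <= 0 means no window: count all failures ever
    private final Deque<Long> failureTimes = new ArrayDeque<>();

    public WindowedFailureTracker(int maxFailures, long windowMillis) {
        this.maxFailures = maxFailures;
        this.windowMillis = windowMillis;
    }

    // Record an executor failure observed at the given timestamp (millis).
    public void recordFailure(long nowMillis) {
        failureTimes.addLast(nowMillis);
        prune(nowMillis);
    }

    // True when failures inside the window exceed the configured maximum,
    // i.e. the application master should give up.
    public boolean shouldFailApplication(long nowMillis) {
        prune(nowMillis);
        return failureTimes.size() > maxFailures;
    }

    // Drop failure timestamps that have aged out of the window.
    private void prune(long nowMillis) {
        if (windowMillis <= 0) {
            return; // windowing disabled: old failures are never forgotten
        }
        while (!failureTimes.isEmpty()
                && failureTimes.peekFirst() < nowMillis - windowMillis) {
            failureTimes.removeFirst();
        }
    }
}
```

With a 1-second window and a limit of 2, three quick failures would kill the application, but the same three failures spread over minutes would not, which is the behavior long-running services want.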