[ https://issues.apache.org/jira/browse/SPARK-45937?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17787926#comment-17787926 ]
Cheng Pan commented on SPARK-45937:
-----------------------------------

[~tgraves] sorry, I missed this ticket and opened a new one, SPARK-45969. The PR is ready for review: https://github.com/apache/spark/pull/43863

> Fix documentation of spark.executor.maxNumFailures
> --------------------------------------------------
>
>                 Key: SPARK-45937
>                 URL: https://issues.apache.org/jira/browse/SPARK-45937
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 3.5.0
>            Reporter: Thomas Graves
>            Priority: Critical
>
> https://issues.apache.org/jira/browse/SPARK-41210 added support for
> spark.executor.maxNumFailures on Kubernetes; it made this config generic and
> deprecated the YARN version. Neither the config nor its default value is
> documented.
>
> https://github.com/apache/spark/commit/40872e9a094f8459b0b6f626937ced48a8d98efb
>
> That change also added spark.executor.failuresValidityInterval.
>
> Both configs need their default values documented for YARN and Kubernetes,
> and the YARN documentation for the equivalent
> spark.yarn.max.executor.failures configuration needs to be removed.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
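For context, the generic configs the ticket asks to document are set like any other Spark conf. A minimal sketch (the threshold, interval, and master URL below are illustrative values, not the undocumented defaults this ticket is about):

```shell
# Sketch: using the generic executor-failure configs introduced by SPARK-41210,
# which deprecate spark.yarn.max.executor.failures and
# spark.yarn.executor.failuresValidityInterval.
# "10" and "1h" are placeholder values chosen for illustration.
spark-submit \
  --master k8s://https://<k8s-apiserver-host>:443 \
  --conf spark.executor.maxNumFailures=10 \
  --conf spark.executor.failuresValidityInterval=1h \
  --class org.example.MyApp \
  my-app.jar
```

With failuresValidityInterval set, only executor failures within that window count toward the maxNumFailures limit, so long-running applications are not killed by failures accumulated over their whole lifetime.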