liupengcheng created SPARK-26941:
------------------------------------

             Summary: maxNumExecutorFailures should be computed with spark.streaming.dynamicAllocation.maxExecutors in streaming
                 Key: SPARK-26941
                 URL: https://issues.apache.org/jira/browse/SPARK-26941
             Project: Spark
          Issue Type: Bug
          Components: Spark Core
    Affects Versions: 2.4.0, 2.1.0
            Reporter: liupengcheng


Currently, when streaming dynamic allocation is enabled for a streaming application, maxNumExecutorFailures in ApplicationMaster is still computed from `spark.dynamicAllocation.maxExecutors`.

It should instead be computed from `spark.streaming.dynamicAllocation.maxExecutors`.

Related code:
{code:java}
private val maxNumExecutorFailures = {
  val effectiveNumExecutors =
    if (Utils.isStreamingDynamicAllocationEnabled(sparkConf)) {
      sparkConf.get(STREAMING_DYN_ALLOCATION_MAX_EXECUTORS)
    } else if (Utils.isDynamicAllocationEnabled(sparkConf)) {
      sparkConf.get(DYN_ALLOCATION_MAX_EXECUTORS)
    } else {
      sparkConf.get(EXECUTOR_INSTANCES).getOrElse(0)
    }
  // By default, effectiveNumExecutors is Int.MaxValue if dynamic allocation is enabled.
  // We need to avoid the integer overflow here.
  val defaultMaxNumExecutorFailures = math.max(3,
    if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue else 2 * effectiveNumExecutors)

  sparkConf.get(MAX_EXECUTOR_FAILURES).getOrElse(defaultMaxNumExecutorFailures)
}
{code}
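
To illustrate the impact, a minimal sketch (the config keys come from the description above; the cap of 50 is an arbitrary example value): with the streaming cap applied, the default failure limit is max(3, 2 * 50) = 100, whereas reading `spark.dynamicAllocation.maxExecutors` (default Int.MaxValue) makes the limit Int.MaxValue, effectively disabling the failure check.

{code:java}
import org.apache.spark.SparkConf

// Example configuration: streaming dynamic allocation capped at 50 executors.
val conf = new SparkConf()
  .set("spark.streaming.dynamicAllocation.enabled", "true")
  .set("spark.streaming.dynamicAllocation.maxExecutors", "50")

// Same default computation as in ApplicationMaster, factored out for illustration.
def defaultMaxFailures(effectiveNumExecutors: Int): Int =
  math.max(3,
    if (effectiveNumExecutors > Int.MaxValue / 2) Int.MaxValue
    else 2 * effectiveNumExecutors)

// With the proposed change: use the streaming cap => max(3, 2 * 50) = 100.
val withStreamingCap = defaultMaxFailures(
  conf.get("spark.streaming.dynamicAllocation.maxExecutors").toInt)

// Today: spark.dynamicAllocation.maxExecutors defaults to Int.MaxValue, so the
// overflow guard pins the limit at Int.MaxValue and it never triggers.
val withBatchCap = defaultMaxFailures(Int.MaxValue)
{code}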


