[ 
https://issues.apache.org/jira/browse/SPARK-26993?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Yifan Guo resolved SPARK-26993.
-------------------------------
    Resolution: Done

> _minRegisteredRatio default value is zero not 0.8 for Yarn
> ----------------------------------------------------------
>
>                 Key: SPARK-26993
>                 URL: https://issues.apache.org/jira/browse/SPARK-26993
>             Project: Spark
>          Issue Type: Question
>          Components: YARN
>    Affects Versions: 2.4.0
>            Reporter: Yifan Guo
>            Priority: Major
>
> private[spark]
> class CoarseGrainedSchedulerBackend(scheduler: TaskSchedulerImpl, val rpcEnv: RpcEnv)
>   extends ExecutorAllocationClient with SchedulerBackend with Logging {
>   // Use an atomic variable to track total number of cores in the cluster for simplicity and speed
>   protected val totalCoreCount = new AtomicInteger(0)
>   // Total number of executors that are currently registered
>   protected val totalRegisteredExecutors = new AtomicInteger(0)
>   protected val conf = scheduler.sc.conf
>   private val maxRpcMessageSize = RpcUtils.maxMessageSizeBytes(conf)
>   private val defaultAskTimeout = RpcUtils.askRpcTimeout(conf)
>   // Submit tasks only after (registered resources / total expected resources)
>   // is equal to at least this value, that is double between 0 and 1.
>   private val _minRegisteredRatio =
>     math.min(1, conf.getDouble("spark.scheduler.minRegisteredResourcesRatio", 0))
>
> override val minRegisteredRatio =
>   if (conf.getOption("spark.scheduler.minRegisteredResourcesRatio").isEmpty) {
>     0.8
>   } else {
>     super.minRegisteredRatio
>   }
>  
> Apparently, if "spark.scheduler.minRegisteredResourcesRatio" is not
> configured, the default value is zero, not 0.8.
>
> Is that on purpose?
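The two snippets above interact as follows: the base backend reads the key with a default of 0, while the YARN-style override substitutes 0.8 only when the key is absent. A minimal self-contained sketch of that behavior (the class names `BaseBackend`, `YarnStyleBackend`, and the `Conf` stand-in are hypothetical, not Spark's actual classes):

```scala
// Sketch of the default-resolution logic around
// "spark.scheduler.minRegisteredResourcesRatio".
object MinRatioSketch {
  // Hypothetical stand-in for SparkConf's getOption/getDouble.
  final case class Conf(entries: Map[String, String]) {
    def getOption(key: String): Option[String] = entries.get(key)
    def getDouble(key: String, default: Double): Double =
      entries.get(key).map(_.toDouble).getOrElse(default)
  }

  class BaseBackend(conf: Conf) {
    // Mirrors CoarseGrainedSchedulerBackend: 0 when the key is unset.
    protected val _minRegisteredRatio: Double =
      math.min(1, conf.getDouble("spark.scheduler.minRegisteredResourcesRatio", 0))
    def minRegisteredRatio: Double = _minRegisteredRatio
  }

  class YarnStyleBackend(conf: Conf) extends BaseBackend(conf) {
    // Mirrors the YARN override: 0.8 only when the key is absent,
    // otherwise defer to the base value.
    override val minRegisteredRatio: Double =
      if (conf.getOption("spark.scheduler.minRegisteredResourcesRatio").isEmpty) 0.8
      else super.minRegisteredRatio
  }

  def main(args: Array[String]): Unit = {
    val unset = Conf(Map.empty)
    println(new BaseBackend(unset).minRegisteredRatio)      // 0.0
    println(new YarnStyleBackend(unset).minRegisteredRatio) // 0.8
  }
}
```

So the 0.8 default only takes effect on the backend that performs the override; a backend using the base class directly sees 0 when the key is unconfigured.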



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
