Github user erenavsarogullari commented on a diff in the pull request:

    https://github.com/apache/spark/pull/16905#discussion_r101173634
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
    @@ -130,15 +130,17 @@ private[spark] class TaskSchedulerImpl private[scheduler](
     
       val mapOutputTracker = SparkEnv.get.mapOutputTracker
     
    -  var schedulableBuilder: SchedulableBuilder = null
    +  private val SCHEDULER_MODE_PROPERTY = "spark.scheduler.mode"
    +  private var schedulableBuilder: SchedulableBuilder = null
       var rootPool: Pool = null
       // default scheduler is FIFO
    -  private val schedulingModeConf = conf.get("spark.scheduler.mode", "FIFO")
     +  private val schedulingModeConf = conf.get(SCHEDULER_MODE_PROPERTY, SchedulingMode.FIFO.toString)
       val schedulingMode: SchedulingMode = try {
         SchedulingMode.withName(schedulingModeConf.toUpperCase)
       } catch {
         case e: java.util.NoSuchElementException =>
     -      throw new SparkException(s"Unrecognized spark.scheduler.mode: $schedulingModeConf")
     +      throw new SparkException(s"Unrecognized $SCHEDULER_MODE_PROPERTY: $schedulingModeConf. " +
     +        s"Supported modes: ${SchedulingMode.FAIR} or ${SchedulingMode.FIFO}.")
    --- End diff ---
    
    Yep, `TaskSetManager` also overrides `schedulingMode` (from the parent `Schedulable` trait) with `NONE`. However, it never actually uses `schedulingMode`. I think if `NONE` is removed, then `FIFO` will end up being used as the default value (e.g. in `TaskSetManager`), right?

