Github user erenavsarogullari commented on a diff in the pull request:

https://github.com/apache/spark/pull/16905#discussion_r101164858

--- Diff: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ---
@@ -130,15 +130,17 @@ private[spark] class TaskSchedulerImpl private[scheduler](

   val mapOutputTracker = SparkEnv.get.mapOutputTracker

-  var schedulableBuilder: SchedulableBuilder = null
+  private val SCHEDULER_MODE_PROPERTY = "spark.scheduler.mode"
+  private var schedulableBuilder: SchedulableBuilder = null
   var rootPool: Pool = null
   // default scheduler is FIFO
-  private val schedulingModeConf = conf.get("spark.scheduler.mode", "FIFO")
+  private val schedulingModeConf = conf.get(SCHEDULER_MODE_PROPERTY, SchedulingMode.FIFO.toString)
   val schedulingMode: SchedulingMode = try {
     SchedulingMode.withName(schedulingModeConf.toUpperCase)
   } catch {
     case e: java.util.NoSuchElementException =>
-      throw new SparkException(s"Unrecognized spark.scheduler.mode: $schedulingModeConf")
+      throw new SparkException(s"Unrecognized $SCHEDULER_MODE_PROPERTY: $schedulingModeConf. " +
+        s"Supported modes: ${SchedulingMode.FAIR} or ${SchedulingMode.FIFO}.")
--- End diff --

The possible `SchedulingMode` values are `FIFO`, `FAIR`, and `NONE`, but `NONE` is an _unsupported_ value. I agree with listing the supported values in the message, and this can be achieved by adding the following logic to the `SchedulingMode` object and reusing it in the required places (twice here and once in `Pool`):

```scala
def getSupportedValuesAsString(): String = values.filter(_ != NONE).mkString(", ")

SchedulingMode.getSupportedValuesAsString() // returns FAIR, FIFO
```

WDYT?
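For context, a minimal self-contained sketch of how the proposed helper could sit on the enumeration. The object declaration mirrors Spark's existing `SchedulingMode`; the `getSupportedValuesAsString` helper and the rewritten throw site are only the suggestion above, not merged code:

```scala
// Mirrors org.apache.spark.scheduler.SchedulingMode, extended with the
// helper proposed in the review comment (a sketch, not merged code).
object SchedulingMode extends Enumeration {
  type SchedulingMode = Value
  val FAIR, FIFO, NONE = Value

  // NONE is an internal placeholder, so keep it out of user-facing messages.
  def getSupportedValuesAsString(): String = values.filter(_ != NONE).mkString(", ")
}

// The throw site in TaskSchedulerImpl could then become:
//   throw new SparkException(s"Unrecognized $SCHEDULER_MODE_PROPERTY: $schedulingModeConf. " +
//     s"Supported modes: ${SchedulingMode.getSupportedValuesAsString()}.")
println(SchedulingMode.getSupportedValuesAsString()) // prints "FAIR, FIFO" (declaration order)
```

Deriving the message from `values` this way means any mode added to the enumeration later shows up in the error text automatically, instead of being hard-coded in several places.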