Github user susanxhuynh commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19543#discussion_r150419093
  
    --- Diff: resource-managers/mesos/src/main/scala/org/apache/spark/deploy/mesos/config.scala ---
    @@ -122,4 +122,11 @@ package object config {
             "Example: key1:val1,key2:val2")
           .stringConf
           .createOptional
    +
    +  private[spark] val DRIVER_CONSTRAINTS =
    +    ConfigBuilder("spark.mesos.driver.constraints")
    +      .doc("Attribute based constraints on mesos resource offers. Applied by the dispatcher " +
    +        "when launching drivers. Default is to accept all offers with sufficient resources.")
    +      .stringConf
    +      .createWithDefault("")
    --- End diff --
    
    @felixcheung I think it's okay. There's a utility function, `parseConstraintString`, that parses the input string into a Map. If the string is empty, it returns an empty Map:
    
https://github.com/apache/spark/blob/fc45c2c88a838b8f46659ebad2a8f3a9923bc95f/resource-managers/mesos/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosSchedulerUtils.scala#L321
    
    This is also consistent with how the executor constraint string is parsed.
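
    For illustration, here is a simplified sketch of how such a constraint string could be parsed (this is a hypothetical stand-in, not Spark's actual `parseConstraintString` implementation): pairs are separated by `;`, each pair maps an attribute name to a comma-separated set of accepted values, and an empty string yields an empty Map, which matches the "accept all offers" default.

    ```scala
    // Hypothetical sketch of constraint-string parsing. Format assumed:
    // "attr1:val1,val2;attr2:val3" -> Map(attr1 -> Set(val1, val2), attr2 -> Set(val3))
    // ""                            -> Map() (i.e. no constraints, accept all offers)
    object ConstraintParser {
      def parse(constraintsVal: String): Map[String, Set[String]] = {
        if (constraintsVal.isEmpty) {
          Map.empty
        } else {
          constraintsVal.split(";").map { pair =>
            // Split only on the first ':' so values may contain further colons.
            val parts = pair.split(":", 2)
            val attr = parts(0)
            val values =
              if (parts.length > 1) parts(1).split(",").filter(_.nonEmpty).toSet
              else Set.empty[String]
            attr -> values
          }.toMap
        }
      }
    }
    ```

    With this sketch, `ConstraintParser.parse("")` is an empty Map, so leaving the config at its `""` default imposes no constraints on offers.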


---
