Github user ifilonenko commented on a diff in the pull request:

    https://github.com/apache/spark/pull/21092#discussion_r182567449
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/features/BasicDriverFeatureStep.scala ---
    @@ -88,15 +94,22 @@ private[spark] class BasicDriverFeatureStep(
             .addToRequests("memory", driverMemoryQuantity)
             .addToLimits("memory", driverMemoryQuantity)
             .endResources()
    -      .addToArgs("driver")
    +      .addToArgs(driverDockerContainer)
           .addToArgs("--properties-file", SPARK_CONF_PATH)
           .addToArgs("--class", conf.roleSpecificConf.mainClass)
    -      // The user application jar is merged into the spark.jars list and managed through that
    -      // property, so there is no need to reference it explicitly here.
    -      .addToArgs(SparkLauncher.NO_RESOURCE)
    -      .addToArgs(conf.roleSpecificConf.appArgs: _*)
    -      .build()
     
    +    val driverContainer =
    +      if (driverDockerContainer == "driver-py") {
    --- End diff --
    
    We could check the appResource here, but that check was already done. I thought it would be overkill to check it twice, since it was already handled when setting `driverDockerContainer`.
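    
    To illustrate the pattern I mean (a rough sketch only; `MainAppResource` and the case classes below are illustrative stand-ins, not the PR's actual types): the resource is matched exactly once to pick `driverDockerContainer`, and the later `driver-py` branch just reuses that value.
    
        object DriverContainerSketch {
          sealed trait MainAppResource
          case class JavaMainAppResource(primaryResource: String) extends MainAppResource
          case class PythonMainAppResource(primaryResource: String) extends MainAppResource
    
          // The resource type is inspected exactly once to pick the container argument.
          def driverDockerContainer(appResource: MainAppResource): String = appResource match {
            case _: PythonMainAppResource => "driver-py"
            case _ => "driver"
          }
    
          def main(args: Array[String]): Unit = {
            val container = driverDockerContainer(PythonMainAppResource("local:///opt/app/main.py"))
            // Later steps branch on the already-computed value, not on the resource again.
            if (container == "driver-py") println("add Python-specific driver args")
            else println("add JVM driver args")
          }
        }
    
    If reviewers prefer, the second branch could match on the resource type directly instead, at the cost of repeating the check.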


---
