Github user vanzin commented on a diff in the pull request:

    https://github.com/apache/spark/pull/22959#discussion_r237587099
  
    --- Diff: resource-managers/kubernetes/core/src/main/scala/org/apache/spark/deploy/k8s/KubernetesConf.scala ---
    @@ -112,125 +72,139 @@ private[spark] case class KubernetesConf[T <: KubernetesRoleSpecificConf](
       def getOption(key: String): Option[String] = sparkConf.getOption(key)
     }
     
    +private[spark] class KubernetesDriverConf(
    +    sparkConf: SparkConf,
    +    val appId: String,
    +    val mainAppResource: MainAppResource,
    +    val mainClass: String,
    +    val appArgs: Array[String],
    +    val pyFiles: Seq[String])
    +  extends KubernetesConf(sparkConf) {
    +
    +  override val resourceNamePrefix: String = {
    +    val custom = if (Utils.isTesting) get(KUBERNETES_DRIVER_POD_NAME_PREFIX) else None
    --- End diff --
    
    I'm trying to avoid creating custom test classes here (that's what I understand by "inject", since there's no way to "inject" this otherwise). There's really only a single test that needs this functionality, IIRC, and this pattern is far more common in Spark than the one you're suggesting.
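    To make the trade-off concrete, here is a minimal, self-contained sketch of the pattern being defended: a production value that tests override through a config key, gated on a testing flag, instead of subclassing or constructor injection. All names here (`PatternSketch`, `isTesting`, `test.podNamePrefix`) are illustrative, not Spark's actual identifiers; Spark's real `Utils.isTesting` consults a system property / environment variable in a similar spirit.

    ```scala
    // Illustrative sketch only; identifiers are hypothetical, not Spark's.
    object PatternSketch {
      // Stand-in for Utils.isTesting: assume a "spark.testing" system property.
      def isTesting: Boolean = sys.props.contains("spark.testing")

      // The custom prefix is honored only when running under tests, so
      // production code paths never read the test-only key.
      def resourceNamePrefix(conf: Map[String, String], appName: String): String = {
        val custom = if (isTesting) conf.get("test.podNamePrefix") else None
        custom.getOrElse(s"$appName-prefix")
      }
    }
    ```

    The upside of this style is that no test-specific subclass or injection seam is needed; the downside is a small amount of test-awareness in production code, which is why reviewers sometimes push back on it.
    
    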


---
