Github user skonto commented on a diff in the pull request:

    https://github.com/apache/spark/pull/14167#discussion_r70779638
  
    --- Diff: core/src/main/scala/org/apache/spark/scheduler/cluster/mesos/MesosClusterScheduler.scala ---
    @@ -353,38 +353,60 @@ private[spark] class MesosClusterScheduler(
         }
       }
     
    -  private def buildDriverCommand(desc: MesosDriverDescription): CommandInfo = {
    -    val appJar = CommandInfo.URI.newBuilder()
    -      .setValue(desc.jarUrl.stripPrefix("file:").stripPrefix("local:")).build()
    -    val builder = CommandInfo.newBuilder().addUris(appJar)
    -    val entries = conf.getOption("spark.executor.extraLibraryPath")
    -      .map(path => Seq(path) ++ desc.command.libraryPathEntries)
    -      .getOrElse(desc.command.libraryPathEntries)
    -
    -    val prefixEnv = if (!entries.isEmpty) {
    -      Utils.libraryPathEnvPrefix(entries)
    -    } else {
    -      ""
    +  private def getDriverExecutorURI(desc: MesosDriverDescription) = {
    +    desc.schedulerProperties.get("spark.executor.uri")
    +      .orElse(desc.command.environment.get("SPARK_EXECUTOR_URI"))
    +  }
    +
    +  private def getDriverEnvironment(desc: MesosDriverDescription): Environment = {
    +    val env = {
    +      val executorOpts = desc.schedulerProperties.map { case (k, v) => s"-D$k=$v" }.mkString(" ")
    +      val executorEnv = Map("SPARK_EXECUTOR_OPTS" -> executorOpts)
    +
    +      val prefix = "spark.mesos.env."
    --- End diff --
    
    Spark does not have a `spark.driverEnv.[EnvironmentVariableName]` setting analogous to `spark.executorEnv.[EnvironmentVariableName]` (see http://spark.apache.org/docs/latest/configuration.html).
    From a UX and naming-consistency standpoint, that is what I would expect. The problem is that this PR only handles the Mesos case, so we cannot use that name (unless we explicitly define it in the docs); also, in client mode the setting is not needed, so it would have to be ignored there.
    In any case, "spark.mesos.env." should be more specific, e.g. `spark.mesos.driver.env`, but as I said that only makes sense in cluster mode, and `spark.driverEnv` seems more appropriate.
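
    To make the prefix-based approach concrete, here is a minimal sketch (not the PR's actual implementation) of collecting driver environment variables from the scheduler properties by filtering on a config prefix. The prefix name `spark.mesos.driverEnv.` is a hypothetical example for illustration, not an existing Spark setting.

    ```scala
    // Sketch: derive driver env vars from scheduler properties by prefix.
    // The prefix "spark.mesos.driverEnv." is an assumed name for illustration.
    object DriverEnvPrefixSketch {
      def driverEnvVars(
          schedulerProperties: Map[String, String],
          prefix: String = "spark.mesos.driverEnv."): Map[String, String] =
        schedulerProperties.collect {
          // keep only keys under the prefix; strip it to get the env var name
          case (k, v) if k.startsWith(prefix) => (k.stripPrefix(prefix), v)
        }

      def main(args: Array[String]): Unit = {
        val props = Map(
          "spark.mesos.driverEnv.LD_LIBRARY_PATH" -> "/opt/libs",
          "spark.executor.memory" -> "2g")
        // only the prefixed entry survives, with the prefix removed
        println(driverEnvVars(props))
      }
    }
    ```

    Whatever name is chosen, the filtering logic is the same; only the prefix string would change between a Mesos-specific name and a generic `spark.driverEnv.` one.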

