Github user ArtRand commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19631#discussion_r149456772
  
    --- Diff: core/src/main/scala/org/apache/spark/deploy/rest/RestSubmissionClient.scala ---
    @@ -398,9 +399,20 @@ private[spark] object RestSubmissionClient {
       val PROTOCOL_VERSION = "v1"
     
       /**
    -   * Submit an application, assuming Spark parameters are specified through the given config.
    -   * This is abstracted to its own method for testing purposes.
    +   * Filter non-spark environment variables from any environment.
        */
    +  private[rest] def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
    +    env.filterKeys { k =>
    +      // SPARK_HOME is filtered out because it is usually wrong on the remote machine (SPARK-12345)
    +      (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
    +        k.startsWith("MESOS_")
    --- End diff --
    
    Yes, I apologize, you're correct. I think this is actually to filter out things like `MESOS_EXECUTOR_ID` and `MESOS_FRAMEWORK_ID`.
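
    To make the intended behavior concrete, here is a minimal, standalone sketch of the filter from the diff above, with a small driver showing which keys survive. The sample environment values are hypothetical, and the `.toMap` call is added so the example also compiles cleanly on Scala 2.13, where `filterKeys` returns a view.

    ```scala
    object FilterEnvExample {
      // Keep SPARK_* variables, except SPARK_ENV_LOADED and SPARK_HOME (the
      // latter is usually wrong on the remote machine), plus MESOS_* variables
      // such as MESOS_EXECUTOR_ID and MESOS_FRAMEWORK_ID.
      def filterSystemEnvironment(env: Map[String, String]): Map[String, String] = {
        env.filterKeys { k =>
          (k.startsWith("SPARK_") && k != "SPARK_ENV_LOADED" && k != "SPARK_HOME") ||
            k.startsWith("MESOS_")
        }.toMap
      }

      def main(args: Array[String]): Unit = {
        val env = Map(
          "SPARK_CONF_DIR"    -> "/opt/spark/conf",  // kept: SPARK_* and not excluded
          "SPARK_HOME"        -> "/opt/spark",       // dropped: explicitly excluded
          "SPARK_ENV_LOADED"  -> "1",                // dropped: explicitly excluded
          "MESOS_EXECUTOR_ID" -> "exec-1",           // kept: MESOS_* prefix
          "PATH"              -> "/usr/bin"          // dropped: neither prefix
        )
        val filtered = filterSystemEnvironment(env)
        assert(filtered.keySet == Set("SPARK_CONF_DIR", "MESOS_EXECUTOR_ID"))
        println(filtered.keySet.toList.sorted.mkString(","))
      }
    }
    ```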


---
