GitHub user tgravescs commented on a diff in the pull request:

    https://github.com/apache/spark/pull/560#discussion_r13868874
  
    --- Diff: yarn/common/src/main/scala/org/apache/spark/deploy/yarn/ClientBase.scala ---
    @@ -342,24 +352,16 @@ trait ClientBase extends Logging {
           sparkConf.set("spark.driver.extraJavaOptions", opts)
         }
     
    +    // Forward the Spark configuration to the application master / executors.
         // TODO: it might be nicer to pass these as an internal environment variable rather than
         // as Java options, due to complications with string parsing of nested quotes.
    -    if (args.amClass == classOf[ExecutorLauncher].getName) {
    -      // If we are being launched in client mode, forward the spark-conf options
    -      // onto the executor launcher
    -      for ((k, v) <- sparkConf.getAll) {
    -        javaOpts += "-D" + k + "=" + "\\\"" + v + "\\\""
    --- End diff --
    
    export SPARK_JAVA_OPTS="-Dspark...=foo". This needs to work for backwards compatibility, but it doesn't with this code change because SparkConf throws an error saying you aren't supposed to do this.
    
    I agree it is cleaner the way you have it, but it's going to require more investigation and a bunch of testing to make sure it works properly, and I would rather have that split off into a separate JIRA/PR. There are already a lot of things in this PR.
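    
    For reference, the pre-change behaviour being discussed is the loop removed in the diff above: each SparkConf entry was re-emitted as an escaped -Dkey="value" Java option for the ExecutorLauncher. A minimal sketch of that pattern (the method name forwardConfAsJavaOpts is just for illustration, not from the patch):
    
        import org.apache.spark.SparkConf
    
        // Sketch only: mirrors the removed loop that forwarded every SparkConf
        // entry to the ExecutorLauncher as an escaped -Dkey="value" Java option.
        def forwardConfAsJavaOpts(sparkConf: SparkConf): Seq[String] =
          sparkConf.getAll.toSeq.map { case (k, v) =>
            "-D" + k + "=" + "\\\"" + v + "\\\""
          }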

