GitHub user tgravescs commented on the pull request:

    https://github.com/apache/spark/pull/560#issuecomment-46603671
  
    So, the example I gave above is on the master branch, and the configs I set 
show up on both the driver and the executors.  I'm not concerned with configs that 
don't start with "spark." as those aren't Spark configs.  When you say it 
doesn't show up on the executors, how are you checking?  You don't see the log lines in 
the executors that I listed?  I ran on both Hadoop 0.23 and 2.4 clusters, so I'm 
wondering why it's not working for you. 
    
     Note that most configs get converted into a SparkConf and sent to the 
executor via Akka when it registers, so they won't show up in the process command 
line as -D flags or via System.getProperty.  The security settings are special as 
they are needed before that registration happens.  
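    
    To illustrate what I mean, here's a rough sketch of that visibility 
difference (hypothetical standalone job submitted with spark-submit; I'm assuming 
the executor-side SparkEnv carries the conf as described above): a config shipped 
over at registration is readable through SparkConf on the executor but not through 
System.getProperty:
    
     import org.apache.spark.{SparkConf, SparkContext, SparkEnv}
    
     object ConfVisibilityCheck {
       def main(args: Array[String]): Unit = {
         val sc = new SparkContext(new SparkConf().setAppName("conf-visibility-check"))
         // Run a single task so both lookups happen inside an executor JVM.
         val Array((viaConf, viaProp)) = sc.parallelize(Seq(1), 1).map { _ =>
           // Shipped in the SparkConf when the executor registered, so visible here:
           val fromConf = SparkEnv.get.conf.getOption("spark.authenticate")
           // Never set as a -D system property on the executor, so expected to be None:
           val fromProp = Option(System.getProperty("spark.authenticate"))
           (fromConf, fromProp)
         }.collect()
         println(s"via SparkConf: $viaConf, via getProperty: $viaProp")
         sc.stop()
       }
     }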
    
    Again, I'm only concerned with actual Spark configs (spark.*), and only with 
the Spark framework itself properly reading them; I'm not concerned 
with application code reading them.
    
    Another example you can use:
     export SPARK_JAVA_OPTS="-Dspark.authenticate=true 
-Dspark.ui.acls.enable=true -Dspark.akka.threads=10 
-Dspark.akka.logAkkaConfig=true"
    
    Make sure that the Akka settings get logged at the beginning of the 
executor processes.
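    
    As a quick sanity check (a hypothetical snippet, not part of this PR; the 
real verification is the Akka config dump in the executor logs), a spark-shell 
started with those options should show all four settings in the driver's 
SparkConf:
    
     Seq("spark.authenticate", "spark.ui.acls.enable",
         "spark.akka.threads", "spark.akka.logAkkaConfig").foreach { key =>
       // sc.getConf returns a copy of the driver's SparkConf
       println(s"$key -> ${sc.getConf.getOption(key)}")
     }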

