sarutak commented on a change in pull request #33777:
URL: https://github.com/apache/spark/pull/33777#discussion_r690930698



##########
File path: docs/configuration.md
##########
@@ -3075,7 +3075,7 @@ to use on each machine and maximum memory.
 Since `spark-env.sh` is a shell script, some of these can be set 
programmatically -- for example, you might
 compute `SPARK_LOCAL_IP` by looking up the IP of a specific network interface.
 
-Note: When running Spark on YARN in `cluster` mode, environment variables need 
to be set using the `spark.yarn.appMasterEnv.[EnvironmentVariableName]` 
property in your `conf/spark-defaults.conf` file.  Environment variables that 
are set in `spark-env.sh` will not be reflected in the YARN Application Master 
process in `cluster` mode.  See the [YARN-related Spark 
Properties](running-on-yarn.html#spark-properties) for more information.

Review comment:
       Hmm, env vars in `spark-env.sh` are still not reflected in the YARN AM 
right?
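
For reference, setting an environment variable for the YARN Application Master in `cluster` mode is done via the `spark.yarn.appMasterEnv.[EnvironmentVariableName]` property in `conf/spark-defaults.conf`; a minimal sketch (the variable name and path below are hypothetical examples, not from this PR):

```
# conf/spark-defaults.conf
# Propagates JAVA_HOME (hypothetical example) to the YARN AM in cluster mode.
# The same variable exported in spark-env.sh would not reach the AM process.
spark.yarn.appMasterEnv.JAVA_HOME  /usr/lib/jvm/java-11
```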




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org


