On Wed, Jan 3, 2018 at 8:18 PM, John Zhuge <john.zh...@gmail.com> wrote:
> Something like:
>
> Note: When running Spark on YARN, environment variables for the executors
> need to be set using the spark.yarn.executorEnv.[EnvironmentVariableName]
> property in your conf/spark-defaults.conf file or on the command line.
> Environment variables that are set in spark-env.sh will not be reflected in
> the executor process.
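
For concreteness, a minimal sketch of what that would look like (MYVAR
and its value are placeholders, not a real setting):

    # conf/spark-defaults.conf
    spark.yarn.executorEnv.MYVAR  some-value

    # or equivalently on the spark-submit command line:
    spark-submit --conf spark.yarn.executorEnv.MYVAR=some-value ...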

I'm not against adding docs, but that's probably true for all
backends. No backend I know of sources spark-env.sh before starting
executors.

For example, the standalone launch scripts source spark-env.sh before
starting the worker daemon, so those env variables "leak" into the
executors. But you can't customize an individual executor's environment
that way without restarting the service.
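
A minimal sketch of that standalone behavior, again with MYVAR as a
placeholder:

    # conf/spark-env.sh on each worker machine (standalone mode)
    export MYVAR=some-value
    # The start scripts source this file before launching the worker
    # daemon, so executors inherit MYVAR from the daemon's environment.
    # Changing it means editing the file and restarting the worker.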

-- 
Marcelo
