Thanks for the reply. I also found that sparkConf.setExecutorEnv works for
YARN.
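For the programmatic route mentioned above, a minimal sketch of the SparkConf approach (assumes Spark is on the classpath; the app name and FOO=bar values are placeholders):

```scala
import org.apache.spark.SparkConf

// setExecutorEnv sets an environment variable for every executor;
// on YARN it is applied when the executor containers are launched
val conf = new SparkConf()
  .setAppName("env-example") // placeholder app name
  .setExecutorEnv("FOO", "bar")
```

Note this only affects the executors; in yarn-client mode the driver runs locally, so driver-side variables still come from the local environment.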

From: Saisai Shao <sai.sai.s...@gmail.com>
Date: Wednesday, February 17, 2016 at 6:02 PM
To: Lin Zhao <l...@exabeam.com>
Cc: "user@spark.apache.org" <user@spark.apache.org>
Subject: Re: Yarn client mode: Setting environment variables

IIUC, if for example you want to set the environment variable FOO=bar on the
executor side, you can put "spark.executorEnv.FOO=bar" in the conf file. The AM
will pick up this configuration and set it as an environment variable when
launching the containers. Just list all the envs you want to set on the
executor side as spark.executorEnv.xxx=xxx.
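A minimal sketch of the conf-file approach described above (property names follow Spark's `spark.executorEnv.[EnvironmentVariableName]` convention; FOO=bar and MY_SETTING are placeholder values):

```
# conf/spark-defaults.conf
# Each spark.executorEnv.* entry becomes an environment variable
# in every executor container launched on YARN.
spark.executorEnv.FOO         bar
spark.executorEnv.MY_SETTING  some-value

# The executors' env is separate from the ApplicationMaster's;
# AM-side variables use the spark.yarn.appMasterEnv.* prefix.
spark.yarn.appMasterEnv.FOO   bar
```

The same properties can also be passed on the command line with `--conf spark.executorEnv.FOO=bar`.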

Thanks
Saisai

On Thu, Feb 18, 2016 at 3:31 AM, Lin Zhao <l...@exabeam.com> wrote:
I've been trying to set some environment variables for the Spark executors but
haven't had much luck. I tried editing conf/spark-env.sh but it doesn't get
through to the executors. I'm running 1.6.0 on YARN; any pointer is
appreciated.

Thanks,
Lin
