[ https://issues.apache.org/jira/browse/SPARK-16194?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15348643#comment-15348643 ]
Marcelo Vanzin commented on SPARK-16194:
----------------------------------------

For YARN you have {{spark.yarn.appMasterEnv.ENV_VAR}}, just like the executor env. Not sure whether other cluster managers have anything like that.

> No way to dynamically set env vars on driver in cluster mode
> ------------------------------------------------------------
>
>                 Key: SPARK-16194
>                 URL: https://issues.apache.org/jira/browse/SPARK-16194
>             Project: Spark
>          Issue Type: Improvement
>    Affects Versions: 2.0.0
>            Reporter: Michael Gummelt
>            Priority: Minor
>
> I often need to dynamically configure a driver when submitting in cluster
> mode, but there's currently no way of setting env vars. {{spark-env.sh}}
> lets me set env vars, but I have to statically build that into my Spark
> distribution. I need a solution for specifying them in {{spark-submit}},
> much like {{spark.executorEnv.[ENV]}}, but for drivers.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
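The workaround Vanzin mentions can be sketched as a {{spark-submit}} invocation (the variable name, value, and application file below are illustrative, not from the ticket; {{spark.yarn.appMasterEnv.*}} reaches the driver only in YARN cluster mode, where the driver runs inside the application master):

```shell
# Set an env var on the YARN application master (which hosts the driver
# in cluster mode) and the same var on the executors.
# MY_CONFIG_DIR and my_app.py are placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.appMasterEnv.MY_CONFIG_DIR=/etc/myapp \
  --conf spark.executorEnv.MY_CONFIG_DIR=/etc/myapp \
  my_app.py
```

In client mode this key is unnecessary for the driver, since the driver process inherits the environment of the shell that ran {{spark-submit}}.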