[ https://issues.apache.org/jira/browse/SPARK-7706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14548203#comment-14548203 ]
Shaik Idris Ali commented on SPARK-7706:
----------------------------------------

Hi [~srowen], thanks for the quick response; sorry, I did not get that across. Basically, the way actions are launched in Oozie (or any other scheduler) is from a Java program, which takes the main class and a bunch of arguments for that class, e.g. org.apache.spark.deploy.SparkSubmit.main(args); we are not required to set anything in system ENV variables.

Link to the Oozie code: https://github.com/apache/oozie/blob/master/sharelib/spark/src/main/java/org.apache.oozie.action.hadoop/SparkMain.java#L104

> Allow setting YARN_CONF_DIR from spark argument
> -----------------------------------------------
>
>                 Key: SPARK-7706
>                 URL: https://issues.apache.org/jira/browse/SPARK-7706
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 1.3.1
>            Reporter: Shaik Idris Ali
>              Labels: oozie, yarn
>
> Currently in SparkSubmitArguments.scala, when master is set to "yarn" (yarn-cluster mode):
> https://github.com/apache/spark/blob/b1f4ca82d170935d15f1fe6beb9af0743b4d81cd/core/src/main/scala/org/apache/spark/deploy/SparkSubmitArguments.scala#L236
> Spark checks whether YARN_CONF_DIR or HADOOP_CONF_DIR is set in the ENV.
> However, we should additionally allow passing YARN_CONF_DIR as a command-line argument. This is particularly handy when Spark is being launched from schedulers like Oozie or Falcon.
> The reason: the Oozie launcher app starts in one of the containers assigned by the YARN RM, and we do not want to set YARN_CONF_DIR in the ENV for all nodes in the cluster. Just passing an argument like -yarnconfdir with the conf dir (e.g. /etc/hadoop/conf) should avoid setting the ENV variable.
> This is blocking us from onboarding Spark from Oozie or Falcon. Thanks.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
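To illustrate the launch pattern the comment describes, here is a minimal sketch of how a scheduler launcher (such as Oozie's SparkMain) might assemble arguments and invoke SparkSubmit in-process. The application class, jar path, and the `--yarn-conf-dir` flag are all hypothetical; that flag is the *proposed* option from this issue and does not exist in Spark 1.3.1.

```java
import java.util.ArrayList;
import java.util.List;

public class LauncherSketch {
    // Builds an argument list the way a scheduler launcher would,
    // before handing it to SparkSubmit.main(args).
    public static String[] buildArgs() {
        List<String> args = new ArrayList<>();
        args.add("--master");
        args.add("yarn-cluster");
        args.add("--class");
        args.add("com.example.MyApp");   // hypothetical application class
        args.add("--yarn-conf-dir");     // proposed flag from this issue (not in Spark 1.3.1)
        args.add("/etc/hadoop/conf");
        args.add("/path/to/app.jar");    // hypothetical application jar
        return args.toArray(new String[0]);
    }

    public static void main(String[] argv) {
        String[] args = buildArgs();
        // In the real launcher this call would be:
        //   org.apache.spark.deploy.SparkSubmit.main(args);
        // Here we just print the assembled command line.
        System.out.println(String.join(" ", args));
    }
}
```

The point of the sketch: because the launcher runs inside a YARN container, everything it can influence is in this argument array, which is why a command-line option for the conf dir is preferable to an environment variable set on every node.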