[ 
https://issues.apache.org/jira/browse/SPARK-2116?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Matei Zaharia resolved SPARK-2116.
----------------------------------

       Resolution: Fixed
    Fix Version/s: 1.1.0

> Load spark-defaults.conf from directory specified by SPARK_CONF_DIR
> -------------------------------------------------------------------
>
>                 Key: SPARK-2116
>                 URL: https://issues.apache.org/jira/browse/SPARK-2116
>             Project: Spark
>          Issue Type: Improvement
>          Components: Deploy
>    Affects Versions: 1.0.0
>            Reporter: Albert Chu
>            Assignee: Albert Chu
>            Priority: Minor
>             Fix For: 1.1.0
>
>         Attachments: SPARK-2116.patch
>
>
> Presently, spark-defaults.conf is loaded from 
> SPARK_HOME/conf/spark-defaults.conf.  As far as I can tell, the only way to 
> point to an alternate file is to pass one on the command line via 
> spark-submit.
> It would be convenient to have an environment variable that specifies a 
> constant alternate location for spark-defaults.conf.  Doing this via 
> SPARK_CONF_DIR would be convenient, similar to HADOOP_CONF_DIR in Hadoop.
> A patch is attached; a github pull request will also be sent.
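
A minimal sketch of the proposed lookup order (the paths below are hypothetical, for illustration only): if SPARK_CONF_DIR is set, spark-defaults.conf is read from that directory; otherwise the launcher falls back to SPARK_HOME/conf.

```shell
# Sketch of the SPARK_CONF_DIR fallback logic, assuming hypothetical paths.
SPARK_HOME=/opt/spark
SPARK_CONF_DIR=/etc/spark/conf   # unset this to exercise the fallback

# Prefer SPARK_CONF_DIR when set, else fall back to SPARK_HOME/conf.
DEFAULTS_FILE="${SPARK_CONF_DIR:-$SPARK_HOME/conf}/spark-defaults.conf"
echo "$DEFAULTS_FILE"
```

With SPARK_CONF_DIR set as above, this prints /etc/spark/conf/spark-defaults.conf; unsetting it yields /opt/spark/conf/spark-defaults.conf, matching the pre-patch behavior.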



--
This message was sent by Atlassian JIRA
(v6.2#6252)