[ https://issues.apache.org/jira/browse/SPARK-6277?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14356478#comment-14356478 ]

Jianshi Huang commented on SPARK-6277:
--------------------------------------

I see. Not tying this to the Hadoop config is fine, but how about env 
variables? I quite often want to change one setting for a particular job, and 
editing spark-defaults.conf every time is inconvenient. Env variables are a 
good fit here because of their dynamic scope.
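(For a single run I can already override a setting with spark-submit's --conf 
flag, e.g.

    spark-submit --conf spark.local.dir=/home/$USER/spark/tmp \
      --class com.example.MyApp myapp.jar

where the class and jar are just placeholders, but repeating that on every 
invocation is exactly the inconvenience I'd like to avoid.)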

Typesafe's Config library has a similar feature: env variables can be 
referenced in a config file and can even override earlier settings.

https://github.com/typesafehub/config#optional-system-or-env-variable-overrides
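For example, in HOCON a second, optional substitution overrides the default 
only when the variable is actually set (a minimal sketch; basedir and 
FORCED_BASEDIR are just the README's illustrative names):

    basedir = "/whatever/whatever"
    basedir = ${?FORCED_BASEDIR}

If FORCED_BASEDIR is defined as a system property or env variable it wins; 
otherwise the second line is simply ignored and the default stands.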

Jianshi

> Allow Hadoop configurations and env variables to be referenced in 
> spark-defaults.conf
> -------------------------------------------------------------------------------------
>
>                 Key: SPARK-6277
>                 URL: https://issues.apache.org/jira/browse/SPARK-6277
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 1.3.0, 1.2.1
>            Reporter: Jianshi Huang
>
> I need to set spark.local.dir to a directory under the user's home instead 
> of /tmp, but currently spark-defaults.conf only allows constant values.
> What I want to write is:
> bq. spark.local.dir /home/${user.name}/spark/tmp
> or
> bq. spark.local.dir /home/${USER}/spark/tmp
> Otherwise I would have to hack bin/spark-class and pass the option through 
> -Dspark.local.dir.
> Jianshi


