[ https://issues.apache.org/jira/browse/SPARK-16521?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Saisai Shao closed SPARK-16521.
-------------------------------
    Resolution: Duplicate

> Add support of parameterized configuration for SparkConf
> --------------------------------------------------------
>
>                 Key: SPARK-16521
>                 URL: https://issues.apache.org/jira/browse/SPARK-16521
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Saisai Shao
>            Priority: Minor
>
> The current SparkConf is a key-value pair mechanism in which each value is a 
> literal string that cannot be changed. In most use cases this key-value system 
> is expressive enough, but in some cases it would be more convenient to make a 
> value a parameterized variable that can be replaced by other configurations.
> One case is {{spark.sql.warehouse.dir}}, whose default value is 
> "file:${system:user.dir}/spark-warehouse", in which {{user.dir}} is replaced 
> with the corresponding system property at runtime.
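> As a minimal sketch of the idea (a simple regex-based resolver, not Spark's 
> actual implementation; the {{SystemVarDemo}} object and its pattern are purely 
> illustrative), such a "system:"-prefixed variable could be expanded against 
> JVM system properties like this:
> {code}
> import scala.util.matching.Regex
> 
> object SystemVarDemo {
>   // Matches ${system:<name>} and captures <name>.
>   private val SystemVar = """\$\{system:([^}]+)\}""".r
> 
>   // Replace each ${system:name} with the JVM system property "name",
>   // leaving the reference untouched if the property is undefined.
>   def resolve(value: String): String =
>     SystemVar.replaceAllIn(value, m =>
>       Regex.quoteReplacement(sys.props.getOrElse(m.group(1), m.matched)))
> 
>   def main(args: Array[String]): Unit = {
>     // Prints e.g. file:/home/alice/spark-warehouse
>     println(resolve("file:${system:user.dir}/spark-warehouse"))
>   }
> }
> {code}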
> Also, several configurations like:
> {code}
> spark.dynamicAllocation.minExecutors 1
> spark.dynamicAllocation.initialExecutors 1
> {code}
> can also be configured as:
> {code}
> spark.dynamicAllocation.minExecutors 1
> spark.dynamicAllocation.initialExecutors ${spark.dynamicAllocation.minExecutors}
> {code}
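> Along the same lines, here is a small sketch of expanding references to other 
> configuration keys (again a hypothetical resolver, not a proposed patch; 
> {{ConfVarDemo}} and the depth bound are illustrative assumptions), with the 
> bound preventing circular definitions from looping forever:
> {code}
> import scala.util.matching.Regex
> 
> object ConfVarDemo {
>   // Matches ${<key>} and captures <key>.
>   private val Ref = """\$\{([^}]+)\}""".r
> 
>   // Repeatedly substitute ${key} with conf(key) until no references
>   // remain; maxDepth bounds the recursion for circular definitions.
>   def expand(value: String, conf: Map[String, String], maxDepth: Int = 10): String =
>     if (maxDepth == 0 || Ref.findFirstIn(value).isEmpty) value
>     else expand(
>       Ref.replaceAllIn(value, m =>
>         Regex.quoteReplacement(conf.getOrElse(m.group(1), m.matched))),
>       conf, maxDepth - 1)
> 
>   def main(args: Array[String]): Unit = {
>     val conf = Map(
>       "spark.dynamicAllocation.minExecutors" -> "1",
>       "spark.dynamicAllocation.initialExecutors" ->
>         "${spark.dynamicAllocation.minExecutors}")
>     // Prints 1
>     println(expand(conf("spark.dynamicAllocation.initialExecutors"), conf))
>   }
> }
> {code}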
> So I propose to add parameterized configuration support to SparkConf. This 
> will not change the original semantics of SparkConf; it just adds one more 
> option for doing configuration.
> This feature is quite useful in our environment: we have some configurations 
> that are version dependent, and it is error-prone and tedious to change them 
> whenever the environment changes.
> Please suggest and comment, thanks a lot.



