[ https://issues.apache.org/jira/browse/SPARK-34345?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17307644#comment-17307644 ]
Arseniy Tashoyan commented on SPARK-34345:
------------------------------------------

Typesafe Config seems attractive, as it provides the following features:
* _include_ directives
* environment variable injection

This is convenient when running an application in Kubernetes and providing the configuration in ConfigMaps.

As an alternative, a Scala wrapper like [PureConfig|https://pureconfig.github.io] could be used.

Implementation details:
* Ship reference files _reference.conf_ within the Spark libraries, containing the default (reference) values for all Spark settings.
** Different Spark libraries may have their own _reference.conf_ files. For example, _spark-sql/reference.conf_ contains the settings specific to Spark SQL, and so on.
* Use the Config API to get values (see the sketch after the quoted description below).
** Clean up the default values currently hard-coded in Scala or Java.
* Introduce a new command-line argument for spark-submit: _--config-file_.
** When both _--config-file_ and _--properties-file_ are specified, ignore the latter and print a warning.
** When only _--properties-file_ is specified, use the legacy mechanism and print a deprecation warning.

> Allow several properties files
> ------------------------------
>
>                 Key: SPARK-34345
>                 URL: https://issues.apache.org/jira/browse/SPARK-34345
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 3.0.1, 3.1.1
>            Reporter: Arseniy Tashoyan
>            Priority: Major
>
> Example: we have two applications, A and B. These applications share some common Spark settings and have some application-specific settings. The idea is to run them like this:
> {code:bash}
> spark-submit --properties-files common.properties,a.properties A
> spark-submit --properties-files common.properties,b.properties B
> {code}
> Benefits:
> - Common settings can be extracted to a common file _common.properties_; there is no need to duplicate them in _a.properties_ and _b.properties_.
> - Applications can override common settings in their respective custom properties files.
> Currently the following mechanism works in SparkSubmitArguments.scala: command-line arguments like _--conf key=value_ override settings from the properties file. This is not enough, because command-line arguments have to be specified in the launcher script; de facto they belong to the binary distribution rather than to the configuration.
> Consider the following scenario: Spark on Kubernetes, with the configuration provided as ConfigMaps. We could have the following ConfigMaps:
> - _a.properties_ // mounted to the Pod with application A
> - _b.properties_ // mounted to the Pod with application B
> - _common.properties_ // mounted to both Pods, A and B
> Meanwhile the launcher script _app-submit.sh_ is the same for both applications A and B, since it contains no configuration settings:
> {code:bash}
> spark-submit --properties-files common.properties,${app_name}.properties ...
> {code}
> *Alternate solution*
> Use Typesafe Config for Spark settings instead of properties files. Typesafe Config allows including files.
> For example, the settings for application A - _a.conf_:
> {code}
> include required("common.conf")
> spark.sql.shuffle.partitions = 240
> {code}
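To make the alternate solution more concrete, below is a minimal sketch of the consuming side, assuming the standard Typesafe Config API and a hypothetical _a.conf_ like the one above; it is an illustration, not actual SparkSubmit code:

{code:scala}
import java.io.File

import com.typesafe.config.{Config, ConfigFactory}

object ConfigExample {
  def main(args: Array[String]): Unit = {
    // Parse the application file; its `include required("common.conf")`
    // directive pulls in the shared settings.
    val appConf = ConfigFactory.parseFile(new File("a.conf"))

    // Layer the application config over the reference defaults found on the
    // classpath (every reference.conf, e.g. a per-module spark-sql/reference.conf),
    // then resolve substitutions such as ${?SOME_ENV_VAR} (environment injection).
    val config: Config = appConf
      .withFallback(ConfigFactory.defaultReference())
      .resolve()

    // Typed access instead of string-only properties.
    val shufflePartitions = config.getInt("spark.sql.shuffle.partitions")
    println(s"spark.sql.shuffle.partitions = $shufflePartitions")
  }
}
{code}

Note that ConfigFactory.load() performs the same reference.conf layering automatically for a file named application.conf or one passed via -Dconfig.file=...; the explicit withFallback chain above just makes the merge order visible.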
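For the original _--properties-files_ request itself, the merge semantics could look like the following sketch (Scala 2.13 converters; the helper is hypothetical): files listed later override the same keys from files listed earlier, so _a.properties_ wins over _common.properties_.

{code:scala}
import java.io.FileInputStream
import java.util.Properties

import scala.jdk.CollectionConverters._

object LayeredProperties {
  // Load several properties files left to right; keys from later files
  // override the same keys from earlier files.
  def load(paths: Seq[String]): Map[String, String] =
    paths.foldLeft(Map.empty[String, String]) { (acc, path) =>
      val props = new Properties()
      val in = new FileInputStream(path)
      try props.load(in) finally in.close()
      acc ++ props.asScala // the later file wins on duplicate keys
    }

  def main(args: Array[String]): Unit =
    // spark-submit --properties-files common.properties,a.properties
    load(Seq("common.properties", "a.properties")).foreach(println)
}
{code}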
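Finally, the proposed precedence between the new and the legacy flags could be expressed roughly as follows; all names here are placeholders for illustration, not actual SparkSubmitArguments internals:

{code:scala}
object FlagPrecedence {
  sealed trait ConfSource
  final case class Hocon(path: String) extends ConfSource
  final case class LegacyProperties(path: String) extends ConfSource
  case object Defaults extends ConfSource

  // --config-file wins over --properties-file; the legacy flag alone still
  // works but triggers a deprecation warning.
  def chooseConfigSource(configFile: Option[String],
                         propertiesFile: Option[String]): ConfSource =
    (configFile, propertiesFile) match {
      case (Some(conf), Some(props)) =>
        println(s"Warning: ignoring --properties-file $props because --config-file $conf is set")
        Hocon(conf)
      case (Some(conf), None) =>
        Hocon(conf)
      case (None, Some(props)) =>
        println("Warning: --properties-file is deprecated, use --config-file instead")
        LegacyProperties(props)
      case (None, None) =>
        Defaults
    }

  def main(args: Array[String]): Unit =
    println(chooseConfigSource(Some("a.conf"), Some("spark.properties")))
}
{code}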