[ https://issues.apache.org/jira/browse/SPARK-21023?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16043160#comment-16043160 ]

Lantao Jin edited comment on SPARK-21023 at 6/8/17 6:06 PM:
------------------------------------------------------------

The purpose is to make the default configuration load in all cases, because the 
parameters an app developer sets are always fewer than what is actually needed.
For example: an app developer sets spark.executor.instances=100 in their properties 
file. A month later the infra team upgrades Spark to a new version and enables 
dynamic resource allocation in the cluster defaults. But the old job never loads 
the new parameters, so dynamic allocation is not enabled for it. This makes the 
cluster harder for the infra team to control and hurts performance for the app team.
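
For illustration (these are standard Spark property names, but the file names and 
values are made up for this example), the two files could look like the snippet 
below. With the current behaviour, passing app.properties via --properties-file 
means spark-defaults.conf is never read, so the job keeps running with a static 
100 executors:

{code}
# spark-defaults.conf (maintained by the infra team)
spark.dynamicAllocation.enabled    true
spark.shuffle.service.enabled      true

# app.properties (passed via --properties-file by the app team)
spark.executor.instances           100
{code}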



> Ignoring the default properties file is not a good choice from a system 
> perspective
> ------------------------------------------------------------------------------------------
>
>                 Key: SPARK-21023
>                 URL: https://issues.apache.org/jira/browse/SPARK-21023
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Submit
>    Affects Versions: 2.1.1
>            Reporter: Lantao Jin
>            Priority: Minor
>
> The default properties file {{spark-defaults.conf}} should not be skipped even 
> when the submit arg {{--properties-file}} is set. The reasons are easy to see:
> * The infrastructure team needs to continually update {{spark-defaults.conf}} 
> when they want to set something as a default for the entire cluster, e.g. for 
> tuning purposes.
> * Application developers only want to override the parameters they really care 
> about, not others they may not even know about (set by the infrastructure team).
> * For most application developers, the purpose of using {{\-\-properties-file}} 
> is to avoid setting dozens of {{--conf k=v}} arguments (see the example after 
> this list). But if {{spark-defaults.conf}} is then silently ignored, the 
> resulting behaviour is unexpected.
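> For example (an illustrative command; the class and jar names are made up), an 
> app developer would rather submit:
> {code}
> spark-submit \
>   --properties-file app.properties \
>   --class com.example.MyApp \
>   my-app.jar
> {code}
> than repeat every cluster-level setting as a separate {{--conf k=v}}, while still 
> expecting the settings in {{spark-defaults.conf}} to apply underneath.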
> All of this is caused by the code below:
> {code}
>   private Properties loadPropertiesFile() throws IOException {
>     Properties props = new Properties();
>     File propsFile;
>     if (propertiesFile != null) {
>       // The default conf properties file is not loaded when the app developer
>       // passes --properties-file as a submit arg.
>       propsFile = new File(propertiesFile);
>       checkArgument(propsFile.isFile(), "Invalid properties file '%s'.", propertiesFile);
>     } else {
>       propsFile = new File(getConfDir(), DEFAULT_PROPERTIES_FILE);
>     }
>     //...
>     return props;
>   }
> {code}
> I can offer a patch to fix it if you think it makes sense.
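> A minimal sketch of the direction such a patch could take (not the final change; 
> it reuses the existing fields and helpers of this class, and {{loadInto}} is just 
> a hypothetical helper name): always read {{spark-defaults.conf}} from the conf 
> dir first, then overlay the file given by {{\-\-properties-file}} so that the 
> app's keys take precedence:
> {code}
>   private Properties loadPropertiesFile() throws IOException {
>     Properties props = new Properties();
>     // Always start from the cluster-wide defaults, if present.
>     File defaultsFile = new File(getConfDir(), DEFAULT_PROPERTIES_FILE);
>     if (defaultsFile.isFile()) {
>       loadInto(props, defaultsFile);
>     }
>     // Then overlay the user-specified file so its keys win over the defaults.
>     if (propertiesFile != null) {
>       File userFile = new File(propertiesFile);
>       checkArgument(userFile.isFile(), "Invalid properties file '%s'.", propertiesFile);
>       loadInto(props, userFile);
>     }
>     return props;
>   }
>
>   // Hypothetical helper: read one properties file into the accumulated set.
>   private void loadInto(Properties props, File file) throws IOException {
>     try (Reader reader = new InputStreamReader(new FileInputStream(file), StandardCharsets.UTF_8)) {
>       props.load(reader);
>     }
>   }
> {code}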



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
