[ 
https://issues.apache.org/jira/browse/SPARK-2641?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14079114#comment-14079114
 ] 

Apache Spark commented on SPARK-2641:
-------------------------------------

User 'kjsingh' has created a pull request for this issue:
https://github.com/apache/spark/pull/1657

> Spark submit doesn't pick up executor instances from properties file
> --------------------------------------------------------------------
>
>                 Key: SPARK-2641
>                 URL: https://issues.apache.org/jira/browse/SPARK-2641
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 1.0.0
>            Reporter: Kanwaljit Singh
>
> When running spark-submit in YARN cluster mode, we provide a properties file 
> using the --properties-file option, e.g.:
>     spark.executor.instances=5
>     spark.executor.memory=2120m
>     spark.executor.cores=3
> The submitted job picks up the cores and memory, but not the executor 
> instances.
> I think the issue is here, in org.apache.spark.deploy.SparkSubmitArguments:
>     // Use properties file as fallback for values which have a direct analog to
>     // arguments in this script.
>     master = Option(master).getOrElse(defaultProperties.get("spark.master").orNull)
>     executorMemory = Option(executorMemory)
>       .getOrElse(defaultProperties.get("spark.executor.memory").orNull)
>     executorCores = Option(executorCores)
>       .getOrElse(defaultProperties.get("spark.executor.cores").orNull)
>     totalExecutorCores = Option(totalExecutorCores)
>       .getOrElse(defaultProperties.get("spark.cores.max").orNull)
>     name = Option(name).getOrElse(defaultProperties.get("spark.app.name").orNull)
>     jars = Option(jars).getOrElse(defaultProperties.get("spark.jars").orNull)
> Along with these defaults, we should also set a default for executor instances:
>     numExecutors = Option(numExecutors)
>       .getOrElse(defaultProperties.get("spark.executor.instances").orNull)
> PS: spark.executor.instances is also not documented at 
> http://spark.apache.org/docs/latest/configuration.html



--
This message was sent by Atlassian JIRA
(v6.2#6252)
