[ 
https://issues.apache.org/jira/browse/SPARK-27003?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Sean Owen resolved SPARK-27003.
-------------------------------
    Resolution: Not A Problem

> [Spark]Message display at console is not correct for spark.executor.instances
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-27003
>                 URL: https://issues.apache.org/jira/browse/SPARK-27003
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.4.0
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Minor
>         Attachments: Spark-JIRA-Executor-Msg.png
>
>
> Steps:
>  spark.executor.instances is {color:#ff0000}not{color} configured on the client side in 
> spark-defaults.conf (the default value is 2).
>  bin/spark-shell --master yarn --executor-memory 1024m --executor-cores 1 
> --conf spark.dynamicAllocation.enabled=true --conf 
> spark.dynamicAllocation.initialExecutors=3 --conf 
> {color:#ff0000}spark.dynamicAllocation.minExecutors=1{color} --conf 
> spark.dynamicAllocation.executorIdleTimeout=60s --conf spark.authenticate=true
> The following warning message is displayed at the console:
> !Spark-JIRA-Executor-Msg.png!
> {color:#ff0000}spark.executor.instances less than 
> spark.dynamicAllocation.minExecutors is invalid, ignoring its setting, please 
> update your configs.{color}
>  
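> Below is a minimal Scala sketch of the kind of check that could produce this warning. It is an illustration only, assuming an unset spark.executor.instances is read back as 0 before being compared with spark.dynamicAllocation.minExecutors; the helper name is made up and this is not Spark's source:
> {code:scala}
> // Hypothetical illustration of the warning condition; not the actual Spark code.
> def warnIfInstancesBelowMin(conf: Map[String, String]): Unit = {
>   // Assumption: an unset spark.executor.instances is treated as 0 here.
>   val instances = conf.get("spark.executor.instances").map(_.toInt).getOrElse(0)
>   val minExecutors = conf.get("spark.dynamicAllocation.minExecutors").map(_.toInt).getOrElse(0)
>   if (instances < minExecutors) {
>     println("WARN: spark.executor.instances less than " +
>       "spark.dynamicAllocation.minExecutors is invalid, ignoring its setting, " +
>       "please update your configs.")
>   }
> }
> {code}
> Under that assumption, with spark.dynamicAllocation.minExecutors=1 and spark.executor.instances left unset, the comparison is 0 < 1, so the warning is printed even though spark.executor.instances was never set.
> 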
> Note:
> When I submit as below with 
> {color:#ff0000}spark.dynamicAllocation.minExecutors=0{color}, no warning 
> message appears at the console:
>  
> bin/spark-shell --master yarn --executor-memory 1024m --executor-cores 1 
> --conf spark.dynamicAllocation.enabled=true --conf 
> spark.dynamicAllocation.initialExecutors=3 --conf 
> spark.dynamicAllocation.minExecutors=0 --conf 
> spark.dynamicAllocation.executorIdleTimeout=60s --conf spark.authenticate=true
>  
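> Under the same assumption, a quick check of why no warning appears in this second case:
> {code:scala}
> // Hypothetical values for the second submission above.
> val instances = 0     // spark.executor.instances not configured, assumed to default to 0
> val minExecutors = 0  // --conf spark.dynamicAllocation.minExecutors=0
> assert(!(instances < minExecutors))  // 0 < 0 is false, so the warning branch is never reached
> {code}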



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
