[ 
https://issues.apache.org/jira/browse/SPARK-28403?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17016316#comment-17016316
 ] 

Zebing Lin commented on SPARK-28403:
------------------------------------

In our production, this logic just caused the requested executor count to fluctuate:
{code:java}
Total executors: Running = 6, Needed = 6, Requested = 6
Lowering target number of executors to 5 (previously 6) because not all 
requested executors are actually needed
Total executors: Running = 6, Needed = 5, Requested = 6
Total executors: Running = 6, Needed = 6, Requested = 6
Lowering target number of executors to 5 (previously 6) because not all 
requested executors are actually needed
Total executors: Running = 6, Needed = 5, Requested = 6
{code}
I think this logic can be deleted.
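The flip-flop in the log above can be reproduced with a toy model of the two allocation paths. This is a hypothetical sketch, not Spark's actual code; `OscillationSketch`, `tick`, and `executorsNeeded` are illustrative names I made up:

```scala
// Hypothetical sketch (names are mine, not Spark's) of the feedback loop
// behind the log above: the upscale path requests one executor beyond what
// is needed whenever speculative tasks are pending, the downscale path then
// sees target > needed and lowers it, and the two alternate forever.
object OscillationSketch {
  // Executors needed to cover all pending + speculative tasks.
  def executorsNeeded(pendingTasks: Int, speculativeTasks: Int,
                      tasksPerExecutor: Int): Int =
    math.ceil((pendingTasks + speculativeTasks).toDouble / tasksPerExecutor).toInt

  // One allocation-manager tick. The `needed + 1` branch models the
  // questioned extra executor; the `target > needed` branch models the
  // "Lowering target number of executors" message in the log.
  def tick(target: Int, needed: Int, hasSpeculative: Boolean): Int =
    if (target > needed) needed          // downscale: lower the target
    else if (hasSpeculative) needed + 1  // questioned branch: one extra
    else needed

  def main(args: Array[String]): Unit = {
    val needed = 5
    var target = needed
    val trace = (1 to 4).map { _ =>
      target = tick(target, needed, hasSpeculative = true)
      target
    }
    println(trace.mkString(", "))  // prints "6, 5, 6, 5"
  }
}
```

Because the extra executor is added unconditionally while speculative tasks are pending, the target can never settle at `needed`, which is why deleting the branch stops the oscillation.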

> Executor Allocation Manager can add an extra executor when speculative tasks
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-28403
>                 URL: https://issues.apache.org/jira/browse/SPARK-28403
>             Project: Spark
>          Issue Type: Bug
>          Components: Spark Core
>    Affects Versions: 2.3.0
>            Reporter: Thomas Graves
>            Priority: Major
>
> It looks like SPARK-19326 introduced a bug in the executor allocation manager 
> where it adds an extra executor when it shouldn't: there are pending 
> speculative tasks but the target number didn't change. 
> [https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala#L377]
> This doesn't look necessary, since the pendingSpeculative tasks are 
> already counted in.
> See the questioning of this on the PR at:
> https://github.com/apache/spark/pull/18492/files#diff-b096353602813e47074ace09a3890d56R379



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
