[ https://issues.apache.org/jira/browse/SPARK-26758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sean Owen resolved SPARK-26758.
-------------------------------
    Resolution: Fixed
    Fix Version/s: 2.3.4
                   2.4.1
                   3.0.0

Issue resolved by pull request 23697
[https://github.com/apache/spark/pull/23697]

> Idle Executors are not getting killed after spark.dynamicAllocation.executorIdleTimeout value
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26758
>                 URL: https://issues.apache.org/jira/browse/SPARK-26758
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.4.0
>        Environment: Spark Version: 2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Assignee: sandeep katta
>            Priority: Major
>             Fix For: 3.0.0, 2.4.1, 2.3.4
>
>         Attachments: SPARK-26758.png
>
> Steps:
> 1. Submit spark-shell with 3 initial executors, a minimum of 0 executors, and executorIdleTimeout=60s:
> {code}
> bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true \
>   --conf spark.dynamicAllocation.initialExecutors=3 \
>   --conf spark.dynamicAllocation.minExecutors=0 \
>   --conf spark.dynamicAllocation.executorIdleTimeout=60s
> {code}
> 2. Open the Spark UI and check the Executors tab.
>
> Observation: the initial 3 executors are assigned, but after 60s (executorIdleTimeout) the number of active executors remains the same.
>
> Expected: apart from the AM container, all other executors should be dead.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
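For context, the expected behavior above can be illustrated with a minimal sketch of idle-timeout bookkeeping. This is a hypothetical model, not Spark's actual ExecutorAllocationManager code: each executor that finishes its last task is timestamped, and once its idle time exceeds the configured timeout it becomes a removal candidate, subject to the minExecutors floor. The class and method names here are invented for illustration.

```python
import time

class IdleExecutorTracker:
    """Hypothetical sketch (not Spark's code) of executorIdleTimeout logic:
    executors idle longer than idle_timeout_s are removal candidates,
    but the pool never shrinks below min_executors."""

    def __init__(self, idle_timeout_s, min_executors, clock=time.monotonic):
        self.idle_timeout_s = idle_timeout_s
        self.min_executors = min_executors
        self.clock = clock
        self.executors = set()
        self.idle_since = {}  # executor_id -> timestamp it last became idle

    def add_executor(self, executor_id):
        # A freshly granted executor is idle until a task is scheduled on it.
        self.executors.add(executor_id)
        self.idle_since[executor_id] = self.clock()

    def on_task_start(self, executor_id):
        # Running a task clears the idle timestamp.
        self.idle_since.pop(executor_id, None)

    def on_task_end(self, executor_id):
        # Executor becomes idle again; restart its idle clock.
        self.idle_since[executor_id] = self.clock()

    def executors_to_kill(self):
        # Executors whose idle time has exceeded the timeout,
        # capped so the pool never drops below min_executors.
        now = self.clock()
        expired = sorted(e for e, t in self.idle_since.items()
                         if now - t >= self.idle_timeout_s)
        removable = max(0, len(self.executors) - self.min_executors)
        return expired[:removable]
```

Under this model, the scenario in the report (minExecutors=0, no tasks ever run) should yield all three executors as kill candidates once 60s have elapsed, which is exactly what the reporter expected and did not observe.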