[ https://issues.apache.org/jira/browse/SPARK-26758?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
ABHISHEK KUMAR GUPTA updated SPARK-26758:
-----------------------------------------
    Attachment: SPARK-26758.png

> Idle Executors are not getting killed after spark.dynamicAllocation.executorIdleTimeout value
> ---------------------------------------------------------------------------------------------
>
>                 Key: SPARK-26758
>                 URL: https://issues.apache.org/jira/browse/SPARK-26758
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 2.4.0
>         Environment: Spark version: 2.4
>            Reporter: ABHISHEK KUMAR GUPTA
>            Priority: Major
>         Attachments: SPARK-26758.png
>
> Steps:
> 1. Submit a Spark shell with initial executors = 3, minimum executors = 0, and executorIdleTimeout = 60s:
> {code}
> bin/spark-shell --master yarn --conf spark.dynamicAllocation.enabled=true \
>   --conf spark.dynamicAllocation.initialExecutors=3 \
>   --conf spark.dynamicAllocation.minExecutors=0 \
>   --conf spark.dynamicAllocation.executorIdleTimeout=60s
> {code}
> 2. Launch the Spark UI and check the Executors tab (a programmatic check is sketched after the expected result below).
>
> Observation:
> The initial 3 executors are assigned. After 60s (executorIdleTimeout), the number of active executors remains the same.
>
> Expected:
> Apart from the AM container, all other executors should be removed.
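> A quick way to watch the executor count from the same shell while waiting out the idle timeout (a minimal sketch, assuming the shell launched in step 1; it uses sc.getExecutorMemoryStatus, whose map also contains the driver entry, so the executor count is the map size minus one):
> {code}
> // Run inside the spark-shell started in step 1.
> // Poll the registered block managers every 10s for ~2 minutes;
> // the driver's own block manager is in the map, hence the "- 1".
> for (i <- 1 to 12) {
>   val activeExecutors = sc.getExecutorMemoryStatus.size - 1
>   println(s"t=${i * 10}s active executors = $activeExecutors")
>   Thread.sleep(10000)
> }
> {code}
> With executorIdleTimeout=60s and no jobs running, the count printed above would be expected to drop towards minExecutors=0 well before the loop ends.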