[ https://issues.apache.org/jira/browse/SPARK-27228?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16798841#comment-16798841 ]
Lukas Waldmann commented on SPARK-27228:
----------------------------------------

Executor management seems to behave strangely after calling spark.stop(). See this:

{quote}
19/03/21 09:51:39 INFO YarnSchedulerBackend$YarnDriverEndpoint: Disabling executor 332.
19/03/21 09:51:39 INFO DAGScheduler: Executor lost: 332 (epoch 446)
19/03/21 09:51:39 INFO BlockManagerMasterEndpoint: Trying to remove executor 332 from BlockManagerMaster.
19/03/21 09:51:39 INFO BlockManagerMasterEndpoint: Removing block manager BlockManagerId(332, data-10.bdp.gin.merck.com, 38713, None)
19/03/21 09:51:39 INFO BlockManagerMaster: Removed 332 successfully in removeExecutor
{quote}

and a few minutes later:

{quote}
19/03/21 09:54:26 WARN HeartbeatReceiver: Removing executor 332 with no recent heartbeats: 173942 ms exceeds timeout 120000 ms
19/03/21 09:54:26 ERROR YarnClusterScheduler: Lost an executor 332 (already removed): Executor heartbeat timed out after 173942 ms
19/03/21 09:54:26 INFO YarnClusterSchedulerBackend: Requesting to kill executor(s) 332
19/03/21 09:54:26 WARN YarnClusterSchedulerBackend: Executor to kill 332 does not exist!
19/03/21 09:54:26 INFO YarnClusterSchedulerBackend: Actual list of executor(s) to be killed is
{quote}

> Spark long delay on close, possible problem with killing executors
> ------------------------------------------------------------------
>
> Key: SPARK-27228
> URL: https://issues.apache.org/jira/browse/SPARK-27228
> Project: Spark
> Issue Type: Bug
> Components: Block Manager
> Affects Versions: 2.3.0
> Reporter: Lukas Waldmann
> Priority: Major
> Attachments: log.html
>
> When using dynamic allocation, after all jobs finish, Spark delays for
> several minutes before finally finishing. The log suggests that executors are not
> cleaned up properly.
> See the attachment for the log.

--
This message was sent by Atlassian JIRA (v7.6.3#76005)
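For context, the reported delay arises under a dynamic-allocation setup. A minimal illustrative configuration is sketched below; these are standard Spark 2.3 property names, but the specific values are assumptions for illustration and are not taken from the reporter's job. Note that the 120000 ms heartbeat timeout in the log corresponds to the default spark.network.timeout of 120s.

```properties
# spark-defaults.conf (illustrative sketch; values are assumptions)
spark.dynamicAllocation.enabled              true
# Dynamic allocation on YARN requires the external shuffle service
spark.shuffle.service.enabled                true
spark.dynamicAllocation.minExecutors         1
spark.dynamicAllocation.maxExecutors         400
# Idle executors are released after this timeout
spark.dynamicAllocation.executorIdleTimeout  60s
# Default heartbeat timeout; matches the "120000 ms" seen in the log
spark.network.timeout                        120s
```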