[ 
https://issues.apache.org/jira/browse/SPARK-46006?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

angerszhu updated SPARK-46006:
------------------------------
    Description: 
We hit a case where the user calls sc.stop() after all custom code has run, but 
the stop gets stuck at some point.

This leads to the following situation:
 # User calls sc.stop()
 # sc.stop() gets stuck at some step, but SchedulerBackend.stop has already been called
 # Since the YARN ApplicationMaster has not finished yet, it still calls 
YarnAllocator.allocateResources()
 # Because the driver port is already closed, allocateResources keeps requesting new executors, which then fail, eventually triggering "Max number of executor failures"
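The fix the title points at can be sketched as follows. This is a minimal, illustrative model only (class and field names besides targetNumExecutorsPerResourceProfileId are hypothetical, not Spark's actual internals): once the backend has stopped, the allocator's per-ResourceProfile executor targets should be cleared so later allocateResources() calls no longer request containers.

```scala
import scala.collection.mutable

// Hypothetical minimal model of the allocator state involved in this bug.
class AllocatorModel {
  // executors still wanted, keyed by ResourceProfile id (illustrative values)
  val targetNumExecutorsPerResourceProfileId = mutable.HashMap(0 -> 2, 1 -> 3)
  @volatile var stopped = false

  def stop(): Unit = {
    stopped = true
    // the missing cleanup: drop outstanding targets so nothing more is requested
    targetNumExecutorsPerResourceProfileId.clear()
  }

  // returns how many new containers a heartbeat would request
  def allocateResources(): Int =
    if (stopped) 0 else targetNumExecutorsPerResourceProfileId.values.sum
}
```

Without the clear-on-stop step, the ApplicationMaster's allocation loop would keep seeing a nonzero target after the driver port closed, and every newly granted executor would fail to register, counting toward the max-failures limit.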

  was:
We hit a case where the user calls sc.stop() after all custom code has run, but 
the stop gets stuck at some point.

This leads to the following situation:
 # User calls sc.stop()
 # sc.stop() gets stuck at some step, but SchedulerBackend.stop has already been called
The ApplicationMaster has not finished yet, so it keeps calling YarnAllocator.allocateResources().
Because the driver port is already closed, allocateResources keeps requesting new executors, which then fail, eventually triggering "Max number of executor failures"


> YarnAllocator fails to clear targetNumExecutorsPerResourceProfileId after 
> YarnSchedulerBackend calls stop
> ----------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-46006
>                 URL: https://issues.apache.org/jira/browse/SPARK-46006
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 3.1.3, 3.2.4, 3.3.2, 3.4.1, 3.5.0
>            Reporter: angerszhu
>            Priority: Major
>             Fix For: 3.4.2, 4.0.0, 3.5.1
>
>
> We hit a case where the user calls sc.stop() after all custom code has run, 
> but the stop gets stuck at some point.
> This leads to the following situation:
>  # User calls sc.stop()
>  # sc.stop() gets stuck at some step, but SchedulerBackend.stop has already been called
>  # Since the YARN ApplicationMaster has not finished yet, it still calls 
> YarnAllocator.allocateResources()
>  # Because the driver port is already closed, allocateResources keeps requesting new executors, which then fail, eventually triggering "Max number of executor failures"



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
