[ https://issues.apache.org/jira/browse/SPARK-3913?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14169760#comment-14169760 ]

Apache Spark commented on SPARK-3913:
-------------------------------------

User 'chesterxgchen' has created a pull request for this issue:
https://github.com/apache/spark/pull/2786

> Spark Yarn Client API change to expose Yarn Resource Capacity, Yarn 
> Application Listener and killApplication() API
> ------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-3913
>                 URL: https://issues.apache.org/jira/browse/SPARK-3913
>             Project: Spark
>          Issue Type: Improvement
>          Components: YARN
>            Reporter: Chester
>
> When working with Spark in YARN deployment mode, we have three issues:
> 1) We don't know the YARN maximum resource capacity (memory and cores) 
> before we specify the number of executors and the memory for the Spark 
> driver and executors. If we set too big a number, the job can exceed the 
> limit and get killed. 
>    It would be better to let the application know the YARN resource 
> capacity ahead of time, so the Spark config can be adjusted dynamically 
> (see the sketch below). 
>   
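> Below is a minimal sketch, in Scala against Hadoop's YarnClient API, of 
> reading the cluster's maximum container capacity before choosing Spark 
> executor settings; this is the Hadoop-side call a Spark-exposed API could 
> wrap (the object name MaxCapacityCheck is just for illustration):
>
>     import org.apache.hadoop.yarn.client.api.YarnClient
>     import org.apache.hadoop.yarn.conf.YarnConfiguration
>
>     object MaxCapacityCheck {
>       def main(args: Array[String]): Unit = {
>         val yarnClient = YarnClient.createYarnClient()
>         yarnClient.init(new YarnConfiguration())
>         yarnClient.start()
>
>         // The new-application response carries the maximum resources
>         // YARN will allocate to a single container on this cluster.
>         val max = yarnClient.createApplication()
>           .getNewApplicationResponse.getMaximumResourceCapability
>
>         println(s"max container memory (MB): ${max.getMemory}")
>         println(s"max container vcores:      ${max.getVirtualCores}")
>
>         yarnClient.stop()
>       }
>     }
>   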
> 2) Once the job has started, we would like some feedback from the YARN 
> application. Currently, the Spark client essentially blocks the call and 
> returns only when the job has finished, failed, or been killed. 
> If the job runs for a few hours, we have no idea how far it has gone: the 
> progress, resource usage, tracking URL, etc. A polling sketch follows. 
> 3) Once the job is started, you basically can't stop it. The YARN client 
> API's stop doesn't work in most cases in our experience, but the YARN API 
> that does work is killApplication(appId). 
>    So we need to expose this killApplication() API through the Spark YARN 
> client as well; a sketch follows. 
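>    For reference, the working Hadoop call is a one-liner (yarnClient and 
> appId as in the polling sketch above):
>
>     // Ask the ResourceManager to kill the application outright.
>     def kill(yarnClient: YarnClient, appId: ApplicationId): Unit =
>       yarnClient.killApplication(appId)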
>    
> I will create a pull request and try to address these problems. 
>  



