GitHub user sitalkedia opened a pull request:

    https://github.com/apache/spark/pull/19534

    [SPARK-22312][CORE] Fix bug in Executor allocation manager in running tasks calculation

    ## What changes were proposed in this pull request?
    
    We often see Spark jobs get stuck when dynamic allocation is turned on: the 
Executor Allocation Manager does not request any executors even though there are 
pending tasks. Looking at the logic in the Executor Allocation Manager that 
calculates the number of running tasks, the calculation can go wrong and the 
running task count can become negative, which prevents new executors from being 
requested.
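    
    To illustrate the failure mode (this is a simplified sketch, not the actual 
Spark internals; the class and method names here are hypothetical): if task-end 
events are counted against a stale or already-cleaned-up stage attempt, a plain 
counter can be decremented below zero. Keying the count by (stageId, attemptId) 
and ignoring events for unknown attempts keeps the total non-negative.
    
    ```scala
    import scala.collection.mutable
    
    // Hypothetical tracker illustrating the idea; not the real
    // ExecutorAllocationManager code.
    class RunningTaskTracker {
      // Keyed by (stageId, stageAttemptId) so events from a retried
      // stage's old attempt cannot decrement another attempt's count.
      private val running = mutable.Map.empty[(Int, Int), Int]
    
      def onTaskStart(stageId: Int, attemptId: Int): Unit = {
        val key = (stageId, attemptId)
        running(key) = running.getOrElse(key, 0) + 1
      }
    
      def onTaskEnd(stageId: Int, attemptId: Int): Unit = {
        val key = (stageId, attemptId)
        // Ignore events for attempts we are no longer tracking,
        // instead of blindly decrementing a shared counter.
        running.get(key).foreach { n =>
          if (n <= 1) running.remove(key)
          else running(key) = n - 1
        }
      }
    
      // Can never go negative, so a "pending + running" executor
      // target computed from it cannot be silently dragged down.
      def totalRunningTasks: Int = running.values.sum
    }
    ```
    
    With a single shared counter, a stale task-end event from an old stage 
attempt would push the total negative; here the stale event is simply dropped.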
    
    
    ## How was this patch tested?
    
    Added a unit test.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/sitalkedia/spark skedia/fix_stuck_job

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19534.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19534
    
----
commit 4f0cffa42c828d3f49e983dae8b2188b78036fcc
Author: Sital Kedia <ske...@fb.com>
Date:   2017-10-19T05:24:38Z

    [SPARK-22312][CORE] Fix bug in Executor allocation manager in running tasks 
calculation

----


---

---------------------------------------------------------------------
To unsubscribe, e-mail: reviews-unsubscr...@spark.apache.org
For additional commands, e-mail: reviews-h...@spark.apache.org
