GitHub user shahidki31 opened a pull request:

    https://github.com/apache/spark/pull/22526

    [SPARK-25502] Empty page when page number exceeds the retainedTasks size.

    ## What changes were proposed in this pull request?
    Test steps:
    1) bin/spark-shell --conf spark.ui.retainedTasks=200
    2) val rdd = sc.parallelize(1 to 1000, 1000)
    3) rdd.count
    
    The Stages tab in the UI displays 10 pages with 100 tasks per page, but 
only 200 tasks are retained, so every page from the 3rd onwards is empty. 
    The total number of pages should instead be calculated from the number 
of tasks that can actually be displayed in the UI.
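    The intended page-count logic can be sketched as follows. This is an 
illustrative standalone example, not Spark's actual code: the object and 
method names are hypothetical, and only the arithmetic (capping the task 
count at the retained limit before dividing into pages) reflects the fix 
described above.
    
    ```scala
    // Hypothetical sketch of the page-count fix: derive the number of pages
    // in the Stages tab from the tasks actually retained, not from the
    // stage's total task count.
    object PageCountSketch {
      def totalPages(totalTasks: Int, retainedTasks: Int, pageSize: Int): Int = {
        // Only retained tasks can be rendered, so cap the count first.
        val displayable = math.min(totalTasks, retainedTasks)
        // Round up so a partial final page still gets its own page.
        (displayable + pageSize - 1) / pageSize
      }
    
      def main(args: Array[String]): Unit = {
        // Repro from this PR: 1000 tasks, spark.ui.retainedTasks=200,
        // 100 tasks per page. The UI showed 10 pages; only 2 have content.
        println(totalPages(1000, 200, 100)) // 2 pages instead of 10
      }
    }
    ```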
    
    **Before the change:**
    
![empty_4](https://user-images.githubusercontent.com/23054875/45918251-b1650580-bea1-11e8-90d3-7e0d491981a2.jpg)
    
    **After the change:**
    
![empty_3](https://user-images.githubusercontent.com/23054875/45918257-c2ae1200-bea1-11e8-960f-dfbdb4a90ae7.jpg)
    
    
    
    ## How was this patch tested?
    
    Manually tested
    


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/shahidki31/spark SPARK-25502

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/22526.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #22526
    
----
commit 6204cbe46b99cc6d897dbcebec81e89b369d58d2
Author: Shahid <shahidki31@...>
Date:   2018-09-22T14:07:22Z

    [SPARK-25502] Empty page when page number exceeds the retainedTasks size.

----


---
