Github user sitalkedia commented on a diff in the pull request:

    https://github.com/apache/spark/pull/19580#discussion_r147328834
  
    --- Diff: 
core/src/main/scala/org/apache/spark/ExecutorAllocationManager.scala ---
    @@ -267,6 +267,10 @@ private[spark] class ExecutorAllocationManager(
         (numRunningOrPendingTasks + tasksPerExecutor - 1) / tasksPerExecutor
       }
     
    +  private def totalRunningTasks(): Int = synchronized {
    --- End diff ---
    
    It's okay to add a method that is used only for unit testing purposes. I am not inclined towards using `maxNumExecutorsNeeded` to indirectly verify `totalRunningTasks`, for the following reason:
    
    Currently, the test case tests exactly what it is supposed to. If we checked `maxNumExecutorsNeeded` instead, it might not be clear what we are actually testing.
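    
    To make the contrast concrete, here is a minimal, self-contained sketch. The `ToyAllocationManager` class, its fields, and the test values are hypothetical stand-ins for illustration only (this is not Spark's `ExecutorAllocationManager` or its test suite); it just places a direct assertion on `totalRunningTasks` next to an indirect check via `maxNumExecutorsNeeded`, which also folds in pending tasks and the ceiling-division rule from the diff above:
    
```scala
object AssertionStyleSketch {

  // Hypothetical, simplified stand-in for the allocation manager's bookkeeping;
  // names and values are assumptions for illustration, not Spark's implementation.
  final case class ToyAllocationManager(
      runningTasks: Int,
      pendingTasks: Int,
      tasksPerExecutor: Int) {

    // The quantity the new accessor exposes directly.
    def totalRunningTasks: Int = runningTasks

    // Derived quantity: executors needed for running + pending tasks,
    // using the same ceiling division as in the diff above.
    def maxNumExecutorsNeeded: Int =
      (runningTasks + pendingTasks + tasksPerExecutor - 1) / tasksPerExecutor
  }

  def main(args: Array[String]): Unit = {
    val manager = ToyAllocationManager(runningTasks = 5, pendingTasks = 3, tasksPerExecutor = 4)

    // Style A: assert the quantity the test is actually about.
    assert(manager.totalRunningTasks == 5)

    // Style B: assert a derived value (2 executors for 8 tasks at 4 per executor)
    // and infer the running-task count from it; a failure here could also come
    // from the pending-task bookkeeping or the rounding rule.
    assert(manager.maxNumExecutorsNeeded == 2)

    println("both assertion styles pass on this toy example")
  }
}
```
    
    A failure in the second style could originate in the running-task count, the pending-task count, or the rounding, which is why the direct check reads more clearly to me.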
    