Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/9127#discussion_r42898147
  
    --- Diff: core/src/main/scala/org/apache/spark/memory/MemoryManager.scala ---
    @@ -102,9 +113,88 @@ private[spark] abstract class MemoryManager extends Logging {
       }
     
       /**
    -   * Release N bytes of execution memory.
    +   * Acquire N bytes of memory for execution, evicting cached blocks if necessary.
    +   * Blocks evicted in the process, if any, are added to `evictedBlocks`.
    +   * @return number of bytes successfully granted (<= N).
    +   */
    +  @VisibleForTesting
    +  private[memory] def doAcquireExecutionMemory(
    +      numBytes: Long,
    +      evictedBlocks: mutable.Buffer[(BlockId, BlockStatus)]): Long
    +
    +  /**
    +   * Try to acquire up to `numBytes` of execution memory for the current task and return the number
    +   * of bytes obtained, or 0 if none can be allocated.
    +   *
    +   * This call may block until there is enough free memory in some situations, to make sure each
    +   * task has a chance to ramp up to at least 1 / 2N of the total memory pool (where N is the # of
    +   * active tasks) before it is forced to spill. This can happen if the number of tasks increases
    +   * but an older task had a lot of memory already.
        */
    -  def releaseExecutionMemory(numBytes: Long): Unit = synchronized {
    +  def acquireExecutionMemory(numBytes: Long, taskAttemptId: Long): Long = synchronized {
    --- End diff --
    
    `final def`?
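
    For context, the 1 / 2N ramp-up bound described in the doc comment above can be
    sketched as follows. This is a hypothetical simplification for illustration only,
    not the actual `MemoryManager` logic; the names `maxGrant` and `minGuarantee` are
    made up, and the real implementation also handles blocking and eviction:

    ```scala
    // Sketch of the fairness bound: with N active tasks, a task is capped at
    // poolSize / N of the pool, and is guaranteed at least poolSize / (2 * N)
    // before it can be forced to spill.
    object FairnessSketch {
      // Most bytes this task may still be granted, given what it already holds.
      def maxGrant(poolSize: Long, numActiveTasks: Int, alreadyHeld: Long): Long = {
        val maxPerTask = poolSize / numActiveTasks
        math.max(0L, maxPerTask - alreadyHeld)
      }

      // Least memory a task should reach before being asked to spill.
      def minGuarantee(poolSize: Long, numActiveTasks: Int): Long =
        poolSize / (2L * numActiveTasks)

      def main(args: Array[String]): Unit = {
        // 1000-byte pool, 4 active tasks: capped at 250 bytes per task,
        // guaranteed at least 125 bytes before spilling.
        assert(maxGrant(1000L, 4, 0L) == 250L)
        assert(maxGrant(1000L, 4, 300L) == 0L) // already over its cap
        assert(minGuarantee(1000L, 4) == 125L)
        println("ok")
      }
    }
    ```

    This is why the call may block: if a late-arriving task is below its
    1 / 2N guarantee, it waits for earlier tasks to release or spill memory
    rather than returning 0 immediately.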

