Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16756688
  
    --- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
    @@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
       private val blockOnCleanupTasks = sc.conf.getBoolean(
         "spark.cleaner.referenceTracking.blocking", true)
     
    +  /**
    +   * Whether the cleaning thread will block on shuffle cleanup tasks.
    +   * This overrides the global setting `blockOnCleanupTasks`.
    +   *
    +   * When the context cleaner is configured to block on every delete request, it can throw
    +   * timeout exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid
    +   * that, this parameter by default disables blocking on shuffle cleanups. Note that this
    +   * does not affect the cleanup of RDDs and broadcasts. This is intended to be a temporary
    +   * workaround until the real Akka issue (referred to in the comment above
    +   * `blockOnCleanupTasks`) is resolved.
    +   */
    +  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
    +    "spark.cleaner.referenceTracking.blocking.shuffle", false)
    --- End diff --
    
    I'm fine with the changes as is. This is not a huge deal since we don't 
expose it.
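
    For reference, the interaction between the two flags in the diff can be sketched as follows. This is a minimal illustration, not code from the PR: the config keys and `getBoolean` defaults come from the diff above, while the `SparkConf` setup around them is assumed for the example.

    ```scala
    import org.apache.spark.SparkConf

    // Global setting: block the cleaning thread on RDD/broadcast cleanup (default true).
    // Shuffle-specific override: do not block on shuffle cleanup (default false),
    // the SPARK-3139 workaround introduced by this PR.
    val conf = new SparkConf()
      .set("spark.cleaner.referenceTracking.blocking", "true")
      .set("spark.cleaner.referenceTracking.blocking.shuffle", "false")

    // Mirrors how ContextCleaner reads the flags, with the same defaults as the diff.
    val blockOnCleanupTasks =
      conf.getBoolean("spark.cleaner.referenceTracking.blocking", true)
    val blockOnShuffleCleanupTasks =
      conf.getBoolean("spark.cleaner.referenceTracking.blocking.shuffle", false)
    ```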

