Github user andrewor14 commented on a diff in the pull request:

    https://github.com/apache/spark/pull/2143#discussion_r16754201

--- Diff: core/src/main/scala/org/apache/spark/ContextCleaner.scala ---
@@ -76,6 +76,20 @@ private[spark] class ContextCleaner(sc: SparkContext) extends Logging {
   private val blockOnCleanupTasks = sc.conf.getBoolean(
     "spark.cleaner.referenceTracking.blocking", true)
+
+  /**
+   * Whether the cleaning thread will block on shuffle cleanup tasks.
+   * This overrides the global setting `blockOnCleanupTasks`.
+   *
+   * When the context cleaner is configured to block on every delete request, it can throw timeout
+   * exceptions on cleanup of shuffle blocks, as reported in SPARK-3139. To avoid that, this
+   * parameter disables blocking on shuffle cleanups by default. Note that this does not affect
+   * the cleanup of RDDs and broadcasts. This is intended to be a temporary workaround
+   * until the real Akka issue (referred to in the comment above `blockOnCleanupTasks`) is
+   * resolved.
+   */
+  private val blockOnShuffleCleanupTasks = sc.conf.getBoolean(
+    "spark.cleaner.referenceTracking.blocking.shuffle", false)
--- End diff --

So that means `spark.cleaner.referenceTracking.blocking` actually has no effect on shuffles, ever. Then I think it might make sense to separate this out into `spark.cleaner.referenceTracking.blocking.{rdd/shuffle/broadcast}`? It's just a little confusing right now, because I would imagine that `spark.cleaner.referenceTracking.blocking` also controls the shuffle behavior if the shuffle-specific config is not set.
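For concreteness, here is a minimal sketch of the fallback semantics the comment describes (resource-specific key wins, otherwise the global key, otherwise the global default). This is hypothetical, not Spark code: `CleanerConfSketch` and `blockOnCleanup` are made-up names, and a plain `Map` stands in for `SparkConf`.

```scala
// Hypothetical sketch of the lookup the reviewer expects:
// a resource-specific key, if present, overrides the global key,
// which in turn falls back to ContextCleaner's default of `true`.
object CleanerConfSketch {
  private val GlobalKey = "spark.cleaner.referenceTracking.blocking"

  // `conf` is a stand-in for SparkConf; `resource` is "rdd", "shuffle", or "broadcast"
  def blockOnCleanup(conf: Map[String, String], resource: String): Boolean = {
    conf.get(s"$GlobalKey.$resource")   // e.g. spark.cleaner.referenceTracking.blocking.shuffle
      .orElse(conf.get(GlobalKey))      // fall back to the global flag if no specific key is set
      .map(_.toBoolean)
      .getOrElse(true)                  // the global default in ContextCleaner is `true`
  }
}
```

Under these semantics, setting only `spark.cleaner.referenceTracking.blocking=true` would also make shuffle cleanup blocking, which is exactly what the patch as written does not do.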