Ngone51 commented on a change in pull request #28476: URL: https://github.com/apache/spark/pull/28476#discussion_r421973198
########## File path: core/src/main/scala/org/apache/spark/scheduler/TaskSchedulerImpl.scala ##########

@@ -688,10 +688,11 @@ private[spark] class TaskSchedulerImpl(
     val errorMsg = s"Fail resource offers for barrier stage ${taskSet.stageId} because only " +
       s"${addressesWithDescs.size} out of a total number of ${taskSet.numTasks}" +
-      s" tasks got resource offers. This happens because barrier execution currently " +
-      s"does not work gracefully with delay scheduling. We highly recommend you to " +
-      s"disable delay scheduling by setting spark.locality.wait=0 as a workaround if " +
-      s"you see this error frequently."
+      s" tasks got resource offers. This could happen if delay scheduling or " +
+      s"blacklisting is enabled, as barrier execution currently does not work " +
+      s"gracefully with them. We highly recommend you to disable delay scheduling " +
+      s"by setting spark.locality.wait=0 or disable blacklisting by setting " +

Review comment:
   Thanks for the review. I realized that blacklisting actually does not work for a barrier task set. As you may know, blacklisting only takes effect when there is a failed task. But a barrier task set is marked as zombie as soon as any task fails, and we don't consider blacklisting for a zombie task set, see:
   https://github.com/apache/spark/blob/8b4862953a879a9b3ba6f57e669efc383df68b7c/core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala#L885-L894

   So I think blacklisting actually won't cause partial task launching. I will close this PR.
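The zombie guard described above can be sketched as follows. This is a minimal, hypothetical model of the behavior, not Spark's actual API: `TaskSetSketch`, `handleFailedTask`, and `blacklistingApplies` are illustrative names standing in for the real `TaskSetManager` logic linked in the comment.

```scala
// Hypothetical sketch of why blacklisting never applies to a barrier task set:
// a barrier task set turns zombie on its first task failure, and blacklist
// handling skips zombie task sets entirely.
case class TaskSetSketch(isBarrier: Boolean) {
  var isZombie: Boolean = false
  var failedTasks: Int = 0

  def handleFailedTask(): Unit = {
    failedTasks += 1
    // For a barrier task set, a single failure fails the whole stage
    // attempt, so the manager is marked zombie immediately.
    if (isBarrier) isZombie = true
  }

  // Mirrors the guard around blacklist checks: blacklisting only matters
  // after a failure, and zombie task sets are ignored.
  def blacklistingApplies: Boolean = !isZombie && failedTasks > 0
}

object Demo extends App {
  val barrier = TaskSetSketch(isBarrier = true)
  barrier.handleFailedTask()
  println(barrier.blacklistingApplies) // false: zombie before blacklisting can act

  val regular = TaskSetSketch(isBarrier = false)
  regular.handleFailedTask()
  println(regular.blacklistingApplies) // true: non-barrier sets stay live
}
```

Under this model, the only way blacklisting could block some (but not all) offers for a task set is if the set stays live after a failure, which is exactly what cannot happen for a barrier task set.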