WeichenXu123 commented on a change in pull request #28596: URL: https://github.com/apache/spark/pull/28596#discussion_r429020982
##########
File path: core/src/test/scala/org/apache/spark/scheduler/BarrierTaskContextSuite.scala
##########

```diff
@@ -69,12 +69,12 @@ class BarrierTaskContextSuite extends SparkFunSuite with LocalSparkContext with
       // Pass partitionId message in
       val message: String = context.partitionId().toString
       val messages: Array[String] = context.allGather(message)
-      messages.toList.iterator
+      Iterator.single(messages.toList)
     }
     // Take a sorted list of all the partitionId messages
     val messages = rdd2.collect().head
     // All the task partitionIds are shared
-    for((x, i) <- messages.view.zipWithIndex) assert(x.toString == i.toString)
+    assert(messages === List("0", "1", "2", "3"))
```

Review comment:
   Shouldn't we verify the whole array returned by `rdd2.collect()`, not only the head of the result array?

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.
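
A minimal sketch of the fuller check the reviewer is suggesting, assuming `rdd2` has four partitions and each barrier task emits the complete gathered list (names and shapes taken from the diff above; this is an illustrative suggestion, not code from the PR):

```scala
// Hypothetical sketch: assert on every collected element, not just the head.
// Assumes rdd2: RDD[List[String]] with 4 partitions, as in the test above.
val collected: Array[List[String]] = rdd2.collect()
assert(collected.length === 4)
// allGather should give every task the same view of all partition ids.
collected.foreach { messages =>
  assert(messages === List("0", "1", "2", "3"))
}
```

Checking every element guards against a regression where only one task's `allGather` result is correct while the others diverge.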