wuyi created SPARK-31379:
----------------------------

             Summary: Fix flaky test: o.a.s.scheduler.CoarseGrainedSchedulerBackendSuite.extra resources from executor
                 Key: SPARK-31379
                 URL: https://issues.apache.org/jira/browse/SPARK-31379
             Project: Spark
          Issue Type: Test
          Components: Spark Core
    Affects Versions: 3.0.0
            Reporter: wuyi
See [https://amplab.cs.berkeley.edu/jenkins/job/SparkPullRequestBuilder/120786/testReport/org.apache.spark.scheduler/CoarseGrainedSchedulerBackendSuite/extra_resources_from_executor/] for details.

{code:java}
sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedDueToTimeoutException: The code passed to eventually never returned normally. Attempted 325 times over 5.01070979 seconds. Last failure message: ArrayBuffer("1", "3") did not equal Array("0", "1", "3").
	at org.scalatest.concurrent.Eventually.tryTryAgain$1(Eventually.scala:432)
	at org.scalatest.concurrent.Eventually.eventually(Eventually.scala:439)
	at org.scalatest.concurrent.Eventually.eventually$(Eventually.scala:391)
	at org.apache.spark.scheduler.CoarseGrainedSchedulerBackendSuite.eventually(CoarseGrainedSchedulerBackendSuite.scala:45)
	at org.scalatest.concurrent.Eventually.eventually(Eventually.scala:337)
	at org.scalatest.concurrent.Eventually.eventually$(Eventually.scala:336)
	at org.apache.spark.scheduler.CoarseGrainedSchedulerBackendSuite.eventually(CoarseGrainedSchedulerBackendSuite.scala:45)
	at org.apache.spark.scheduler.CoarseGrainedSchedulerBackendSuite.$anonfun$new$12(CoarseGrainedSchedulerBackendSuite.scala:264)
	at org.scalatest.OutcomeOf.outcomeOf(OutcomeOf.scala:85)
	at org.scalatest.OutcomeOf.outcomeOf$(OutcomeOf.scala:83)
	at org.scalatest.OutcomeOf$.outcomeOf(OutcomeOf.scala:104)
	at org.scalatest.Transformer.apply(Transformer.scala:22)
	at org.scalatest.Transformer.apply(Transformer.scala:20)
{code}
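The failure message shows the check eventually comparing a free-resource snapshot ArrayBuffer("1", "3") against the full expected set Array("0", "1", "3"), i.e. address "0" is missing at the moment of the check. One plausible reading is a race: something (e.g. an in-flight task) still holds address "0" while the assertion runs. A minimal standalone sketch of that pattern (illustrative names only, not the suite's actual code):

```scala
import scala.collection.mutable.ArrayBuffer

// Illustrative model (not Spark code): an executor tracks its free resource
// addresses in a mutable buffer; a task temporarily removes the address it holds.
object FlakyResourceCheckDemo {
  def main(args: Array[String]): Unit = {
    val allAddresses = Array("0", "1", "3")
    val free = ArrayBuffer("0", "1", "3")

    // A concurrent task grabs address "0" ...
    free -= "0"
    // ... so a check racing with it sees ArrayBuffer("1", "3"), not the full set:
    assert(free != allAddresses.toBuffer)

    // Only once the task releases the address does the expected state hold;
    // sorting before comparing also guards against incidental ordering differences.
    free += "0"
    assert(free.sorted == allAddresses.toBuffer.sorted)
    println("ok")
  }
}
```

Under this reading, the test would need to wait until no task holds a resource (or account for held addresses) before asserting on the free set.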