[ https://issues.apache.org/jira/browse/SPARK-34166?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-34166:
----------------------------------
    Issue Type: Test  (was: Improvement)

> Fix flaky test in DecommissionWorkerSuite
> -----------------------------------------
>
>                 Key: SPARK-34166
>                 URL: https://issues.apache.org/jira/browse/SPARK-34166
>             Project: Spark
>          Issue Type: Test
>          Components: Spark Core, Tests
>    Affects Versions: 3.1.0, 3.2.0, 3.1.1
>            Reporter: ulysses you
>            Priority: Minor
>
> The test `decommission workers ensure that shuffle output is regenerated even
> with shuffle service` assumes it has two executors, so that both tasks can
> execute concurrently.
> If there is only one executor, the two tasks execute serially and the test
> result is unexpected. E.g.
> ```
> [info] 5 did not equal 4 Expected 4 tasks but got List(0:0:0:0-SUCCESS,
> 0:0:1:0-FAILED, 0:0:1:1-SUCCESS, 0:1:0:0-SUCCESS, 1:0:0:0-SUCCESS)
> (DecommissionWorkerSuite.scala:190)
> ```

--
This message was sent by Atlassian Jira
(v8.3.4#803005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
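The flakiness described above stems from the test assuming both tasks run concurrently, which only holds once two executors have registered. A minimal, hypothetical sketch of how a suite could guard against this, using Spark's `TestUtils.waitUntilExecutorsUp` helper (the actual patch for SPARK-34166 may differ; the timeout value is illustrative):

```scala
// Hypothetical guard inside a suite like DecommissionWorkerSuite:
// block until both executors are registered before submitting the job,
// so the two tasks can actually execute in parallel as the assertion assumes.
import org.apache.spark.TestUtils

// `sc` is the suite's SparkContext; 60000 ms is an illustrative timeout.
TestUtils.waitUntilExecutorsUp(sc, numExecutors = 2, timeout = 60000)

// Only now run the job whose shuffle output the test inspects; with one
// executor the tasks would serialize and the expected task count would change.
```

Without such a guard, a slow second executor leaves the scheduler free to run both tasks serially on the first one, producing the extra task attempt seen in the `5 did not equal 4` failure.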