Marcelo Vanzin created SPARK-22864:
--------------------------------------

             Summary: Flaky test: ExecutorAllocationManagerSuite "cancel pending executors when no longer needed"
                 Key: SPARK-22864
                 URL: https://issues.apache.org/jira/browse/SPARK-22864
             Project: Spark
          Issue Type: Bug
          Components: Tests
    Affects Versions: 2.3.0
            Reporter: Marcelo Vanzin


Failed in one PR run:

{noformat}
sbt.ForkMain$ForkError: org.scalatest.exceptions.TestFailedException: 0 did not equal -1
        at org.scalatest.Assertions$class.newAssertionFailedException(Assertions.scala:528)
        at org.scalatest.FunSuite.newAssertionFailedException(FunSuite.scala:1560)
        at org.scalatest.Assertions$AssertionsHelper.macroAssert(Assertions.scala:501)
        at org.apache.spark.ExecutorAllocationManagerSuite$$anonfun$9.apply(ExecutorAllocationManagerSuite.scala:271)
        at org.apache.spark.ExecutorAllocationManagerSuite$$anonfun$9.apply(ExecutorAllocationManagerSuite.scala:247)
{noformat}

The logs contain a more interesting exception:

{noformat}
17/12/21 04:26:10.846 pool-1-thread-1-ScalaTest-running-ExecutorAllocationManagerSuite INFO ExecutorAllocationManager: Requesting 2 new executors because tasks are backlogged (new desired total will be 5)
17/12/21 04:26:10.858 spark-listener-group-appStatus ERROR AsyncEventQueue: Listener AppStatusListener threw an exception
java.lang.UnsupportedOperationException: duration() called on unfinished task
        at org.apache.spark.scheduler.TaskInfo.duration(TaskInfo.scala:118)
        at org.apache.spark.status.AppStatusListener$$anonfun$onTaskEnd$1.apply(AppStatusListener.scala:468)
        at org.apache.spark.status.AppStatusListener$$anonfun$onTaskEnd$1.apply(AppStatusListener.scala:435)
        at scala.Option.foreach(Option.scala:257)
{noformat}

This looks like a race condition, since the test passed several attempts locally.
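
For context, a minimal Scala sketch of the kind of guard that produces the exception above; TaskInfoSketch and its fields are hypothetical stand-ins inferred from the error message, not Spark's actual TaskInfo code. Since AsyncEventQueue delivers events to AppStatusListener on its own thread (the spark-listener-group-appStatus thread in the log), onTaskEnd can plausibly observe a TaskInfo whose finish time has not been recorded yet, tripping the guard:

{noformat}
// Illustrative sketch only; names are assumptions inferred from the
// logged message, not Spark's actual TaskInfo source.
class TaskInfoSketch(launchTime: Long) {
  // 0 means "not finished yet"; set when the task completes.
  @volatile var finishTime: Long = 0L

  def finished: Boolean = finishTime != 0L

  def duration: Long = {
    if (!finished) {
      // Exact message seen in the logs above.
      throw new UnsupportedOperationException("duration() called on unfinished task")
    }
    finishTime - launchTime
  }
}
{noformat}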


