[ https://issues.apache.org/jira/browse/SPARK-9497?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14648721#comment-14648721 ]
Kay Ousterhout commented on SPARK-9497:
---------------------------------------

When did you start seeing these failures? I added the test 1.5 years ago, so it seems likely that this is related to a recent change in the code. The error is happening when we try to stop the AppClient from SparkDeploySchedulerBackend:
https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/scheduler/cluster/SparkDeploySchedulerBackend.scala#L96

Is it possible this started happening after [~zsxwing]'s recent change?
https://github.com/apache/spark/commit/3bee0f1466ddd69f26e95297b5e0d2398b6c6268
(That commit changed the line of code in AppClient.scala that is failing: https://github.com/apache/spark/blame/master/core/src/main/scala/org/apache/spark/deploy/client/AppClient.scala#L251)

> Flaky test: DistributedSuite failed after the test of "repeatedly failing
> task that crashes JVM"
> ------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-9497
>                 URL: https://issues.apache.org/jira/browse/SPARK-9497
>             Project: Spark
>          Issue Type: Bug
>          Components: Tests
>            Reporter: Yin Huai
>              Labels: flaky-test
>
> It seems fairly common for DistributedSuite to fail right after
> "repeatedly failing task that crashes JVM".
> One example Jenkins run can be found at
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3117/AMPLAB_JENKINS_BUILD_PROFILE=hadoop1.0,label=centos/testReport/junit/org.apache.spark/DistributedSuite/
> Its log can be found at
> https://amplab.cs.berkeley.edu/jenkins/job/Spark-Master-SBT/3117/AMPLAB_JENKINS_BUILD_PROFILE=hadoop1.0,label=centos/artifact/core/target/unit-tests.log
> (search for StopAppClient).

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)