Do you get this failure repeatedly?
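If it is intermittent, it might be quicker to re-run just that suite instead of the whole dev/run-tests pass and see whether it reproduces. A possible invocation (assuming the streaming module's sbt project is named "streaming", as in the Spark build) would be something like:

    build/sbt "streaming/test-only org.apache.spark.streaming.JavaAPISuite"

The root cause in your trace is a 120-second RPC timeout while stopping the MapOutputTracker, so a repeat run should help tell a slow or overloaded test machine apart from a real regression.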


On Thu, May 14, 2015 at 12:55 AM, kf <wangf...@huawei.com> wrote:

> Hi all, I got the following error when I ran the Spark unit tests via
> dev/run-tests on the latest "branch-1.4" branch.
>
> The latest commit ID:
> commit d518c0369fa412567855980c3f0f426cde5c190d
> Author: zsxwing <zsxw...@gmail.com>
> Date:   Wed May 13 17:58:29 2015 -0700
>
> The error:
>
> [info] Test org.apache.spark.streaming.JavaAPISuite.testCount started
> [error] Test org.apache.spark.streaming.JavaAPISuite.testCount failed: org.apache.spark.SparkException: Error communicating with MapOutputTracker
> [error]     at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:113)
> [error]     at org.apache.spark.MapOutputTracker.sendTracker(MapOutputTracker.scala:119)
> [error]     at org.apache.spark.MapOutputTrackerMaster.stop(MapOutputTracker.scala:324)
> [error]     at org.apache.spark.SparkEnv.stop(SparkEnv.scala:93)
> [error]     at org.apache.spark.SparkContext.stop(SparkContext.scala:1577)
> [error]     at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:626)
> [error]     at org.apache.spark.streaming.StreamingContext.stop(StreamingContext.scala:597)
> [error]     at org.apache.spark.streaming.TestSuiteBase$class.runStreamsWithPartitions(TestSuiteBase.scala:403)
> [error]     at org.apache.spark.streaming.JavaTestUtils$.runStreamsWithPartitions(JavaTestUtils.scala:102)
> [error]     at org.apache.spark.streaming.TestSuiteBase$class.runStreams(TestSuiteBase.scala:344)
> [error]     at org.apache.spark.streaming.JavaTestUtils$.runStreams(JavaTestUtils.scala:102)
> [error]     at org.apache.spark.streaming.JavaTestBase$class.runStreams(JavaTestUtils.scala:74)
> [error]     at org.apache.spark.streaming.JavaTestUtils$.runStreams(JavaTestUtils.scala:102)
> [error]     at org.apache.spark.streaming.JavaTestUtils.runStreams(JavaTestUtils.scala)
> [error]     at org.apache.spark.streaming.JavaAPISuite.testCount(JavaAPISuite.java:103)
> [error]     ...
> [error] Caused by: org.apache.spark.SparkException: Error sending message [message = StopMapOutputTracker]
> [error]     at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:116)
> [error]     at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:78)
> [error]     at org.apache.spark.MapOutputTracker.askTracker(MapOutputTracker.scala:109)
> [error]     ... 52 more
> [error] Caused by: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
> [error]     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
> [error]     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
> [error]     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
> [error]     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
> [error]     at scala.concurrent.Await$.result(package.scala:107)
> [error]     at org.apache.spark.rpc.RpcEndpointRef.askWithRetry(RpcEndpointRef.scala:102)
> [error]     ... 54 more
>
>
>
