[ https://issues.apache.org/jira/browse/SPARK-20210?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15956436#comment-15956436 ]
Kazuaki Ishizaki commented on SPARK-20210:
------------------------------------------

I ran the following two tests (DatasetCacheSuite and CachedTableSuite) with Ubuntu 16.04 / java 1.8.0_111. However, I did not see this exception.

{noformat}
% build/sbt "sql/test-only *sql.DatasetCacheSuite"
...
[info] ScalaTest
[info] Run completed in 10 seconds, 806 milliseconds.
[info] Total number of tests run: 4
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 4, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 4, Failed 0, Errors 0, Passed 4
[success] Total time: 637 s, completed Apr 5, 2017 4:00:31 PM

% build/sbt "sql/test-only *sql.CachedTableSuite"
...
[info] ScalaTest
[info] Run completed in 24 seconds, 214 milliseconds.
[info] Total number of tests run: 30
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 30, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[info] Passed: Total 30, Failed 0, Errors 0, Passed 30
[success] Total time: 44 s, completed Apr 5, 2017 4:10:27 PM
{noformat}

{noformat}
$ cat /etc/os-release
NAME="Ubuntu"
VERSION="16.04.1 LTS (Xenial Xerus)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 16.04.1 LTS"
VERSION_ID="16.04"
HOME_URL="http://www.ubuntu.com/"
SUPPORT_URL="http://help.ubuntu.com/"
BUG_REPORT_URL="http://bugs.launchpad.net/ubuntu/"
VERSION_CODENAME=xenial
UBUNTU_CODENAME=xenial

$ java -version
openjdk version "1.8.0_111"
OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-2ubuntu0.16.04.2-b14)
OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
{noformat}

> Scala tests aborted in Spark SQL on ppc64le
> -------------------------------------------
>
>                 Key: SPARK-20210
>                 URL: https://issues.apache.org/jira/browse/SPARK-20210
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 2.2.0
>         Environment: Ubuntu 14.04 ppc64le
> $ java -version
> openjdk version "1.8.0_111"
> OpenJDK Runtime Environment (build 1.8.0_111-8u111-b14-3~14.04.1-b14)
> OpenJDK 64-Bit Server VM (build 25.111-b14, mixed mode)
>            Reporter: Sonia Garudi
>            Priority: Minor
>              Labels: ppc64le
>
> The tests get aborted with the following error:
> {code}
> *** RUN ABORTED ***
>   org.apache.spark.rpc.RpcTimeoutException: Futures timed out after [120 seconds]. This timeout is controlled by spark.rpc.askTimeout
>   at org.apache.spark.rpc.RpcTimeout.org$apache$spark$rpc$RpcTimeout$$createRpcTimeoutException(RpcTimeout.scala:47)
>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:62)
>   at org.apache.spark.rpc.RpcTimeout$$anonfun$addMessageIfTimeout$1.applyOrElse(RpcTimeout.scala:58)
>   at scala.runtime.AbstractPartialFunction.apply(AbstractPartialFunction.scala:36)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:76)
>   at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   ...
>   Cause: java.util.concurrent.TimeoutException: Futures timed out after [120 seconds]
>   at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>   at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>   at org.apache.spark.util.ThreadUtils$.awaitResult(ThreadUtils.scala:201)
>   at org.apache.spark.rpc.RpcTimeout.awaitResult(RpcTimeout.scala:75)
>   at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:125)
>   at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:1792)
>   at org.apache.spark.rdd.RDD.unpersist(RDD.scala:216)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at org.apache.spark.sql.execution.CacheManager$$anonfun$clearCache$1$$anonfun$apply$mcV$sp$1.apply(CacheManager.scala:75)
>   at scala.collection.Iterator$class.foreach(Iterator.scala:893)
>   ...
> {code}

--
This message was sent by Atlassian JIRA (v6.3.15#6346)
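The abort is an RPC ask timeout (120 s default) hit while CacheManager.clearCache unpersists cached RDDs via BlockManagerMaster.removeRdd. For anyone trying to reproduce this on slow hardware such as the ppc64le box in the report, a minimal sketch of the code path involved, with spark.rpc.askTimeout raised through the public configuration API (assumes Spark 2.2 on the classpath; the app name and "600s" value are illustrative, not recommended settings):

```scala
import org.apache.spark.sql.SparkSession

object Spark20210Repro {
  def main(args: Array[String]): Unit = {
    // Raise the RPC ask timeout past the 120 s default named in the trace.
    val spark = SparkSession.builder()
      .master("local[2]")
      .appName("SPARK-20210-repro")            // hypothetical app name
      .config("spark.rpc.askTimeout", "600s")  // illustrative value
      .getOrCreate()

    val df = spark.range(0, 1000000).toDF("id")
    df.cache()
    df.count()  // materialize the cached plan

    // Public entry point into CacheManager.clearCache(), the call that
    // timed out in BlockManagerMaster.removeRdd in the reported trace.
    spark.catalog.clearCache()
    spark.stop()
  }
}
```

The same setting can also be passed to the failing sbt run as a JVM property, e.g. `build/sbt -Dspark.rpc.askTimeout=600s "sql/test-only *sql.CachedTableSuite"`, though whether a given suite picks it up depends on how it constructs its SparkConf.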