How large is the dataset, and have you tried increasing the driver memory?
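For reference, a minimal sketch of how driver memory is usually raised at submit time (assuming a `spark-submit` deployment; `your-app.jar` and the 8g size are placeholders to adjust for your job):

```shell
# Sketch only: bump the driver heap from the default at submission time.
# --driver-memory is a standard spark-submit option; pick a size that fits
# the metadata/results the driver actually has to hold.
spark-submit \
  --driver-memory 8g \
  your-app.jar
```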

Thanks
Best Regards

On Sat, Jan 17, 2015 at 1:01 PM, Kevin (Sangwoo) Kim <kevin...@apache.org>
wrote:

> Hi experts,
> I got an error while unpersisting an RDD.
> Any ideas?
>
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>     at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>     at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>     at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>     at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>     at scala.concurrent.Await$.result(package.scala:107)
>     at org.apache.spark.storage.BlockManagerMaster.removeRdd(BlockManagerMaster.scala:103)
>     at org.apache.spark.SparkContext.unpersistRDD(SparkContext.scala:951)
>     at org.apache.spark.rdd.RDD.unpersist(RDD.scala:168)
>
>
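For what it's worth, the 30-second limit in the trace above is the timeout on the driver's ask to the BlockManagerMaster during `removeRdd`. If block removal is merely slow rather than stuck, raising that timeout may help; alternatively, `rdd.unpersist(blocking = false)` returns immediately without waiting for the removal acks at all. A sketch, assuming Spark 1.x property names (verify against your version's configuration docs):

```shell
# Sketch only: raise the ask timeout used for block-manager RPCs.
# spark.akka.askTimeout is the Spark 1.x property (seconds); in my
# understanding it governs the Await.result call in the trace.
spark-submit \
  --conf spark.akka.askTimeout=120 \
  your-app.jar
```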
