I'm not sure what performance to expect for that amount of data, but you
could try increasing the timeout via the property "spark.akka.timeout" to
see if that helps.

Bryan

On Sun, Apr 26, 2015 at 6:57 AM, Deepak Gopalakrishnan <dgk...@gmail.com>
wrote:

> Hello All,
>
> I'm trying to process a 3.5GB file in standalone mode using Spark. I could
> run my Spark job successfully on a 100MB file, and it works as expected. But
> when I try to run it on the 3.5GB file, I run into the error below:
>
>
> 15/04/26 12:45:50 INFO BlockManagerMaster: Updated info of block taskresult_83
> 15/04/26 12:46:46 WARN AkkaUtils: Error sending message [message = 
> Heartbeat(2,[Lscala.Tuple2;@790223d3,BlockManagerId(2, master.spark.com, 
> 39143))] in 1 attempts
> java.util.concurrent.TimeoutException: Futures timed out after [30 seconds]
>       at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
>       at 
> scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
>       at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
>       at 
> scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
>       at scala.concurrent.Await$.result(package.scala:107)
>       at org.apache.spark.util.AkkaUtils$.askWithReply(AkkaUtils.scala:195)
>       at org.apache.spark.executor.Executor$$anon$1.run(Executor.scala:427)
> 15/04/26 12:47:15 INFO MemoryStore: ensureFreeSpace(26227673) called with 
> curMem=265897, maxMem=5556991426
> 15/04/26 12:47:15 INFO MemoryStore: Block taskresult_92 stored as bytes in 
> memory (estimated size 25.0 MB, free 5.2 GB)
> 15/04/26 12:47:16 INFO MemoryStore: ensureFreeSpace(26272879) called with 
> curMem=26493570, maxMem=5556991426
> 15/04/26 12:47:16 INFO MemoryStore: Block taskresult_94 stored as bytes in 
> memory (estimated size 25.1 MB, free 5.1 GB)
> 15/04/26 12:47:18 INFO MemoryStore: ensureFreeSpace(26285327) called with 
> curMem=52766449, maxMem=5556991426
>
>
> and the job fails.
>
>
> I'm on AWS and have opened all ports. Since the 100MB file works, it should
> not be a connection issue. I have one r3.xlarge and two m3.large instances.
>
> Can anyone suggest a way to fix this?
>
> --
> Regards,
> *Deepak Gopalakrishnan*
> *Mobile*:+918891509774
> *Skype* : deepakgk87
> http://myexps.blogspot.com
>
>
