Hello guys, 

I am trying to run a Random Forest on 30 MB of data. I have a cluster of 4
machines; each machine has 106 GB of RAM and 16 cores.
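
For reference, here is a minimal sketch of the job. The training call is
MLlib's RandomForest.trainClassifier as in 1.2, but the input path and the
tree parameters (numTrees, maxDepth, maxBins, ...) below are placeholders,
not my exact values:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.tree.RandomForest
import org.apache.spark.mllib.util.MLUtils

object RFTrain {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("RFTrain"))

    // Placeholder path: the real file is the ~30 MB training set.
    val data = MLUtils.loadLibSVMFile(sc, "hdfs:///data/train.libsvm")

    // Placeholder parameters: numTrees, maxDepth and maxBins are
    // illustrative, not my exact settings.
    val model = RandomForest.trainClassifier(
      data,
      numClasses = 2,
      categoricalFeaturesInfo = Map[Int, Int](),
      numTrees = 100,
      featureSubsetStrategy = "auto",
      impurity = "gini",
      maxDepth = 10,
      maxBins = 32)

    println(model.toDebugString.take(500))
    sc.stop()
  }
}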

I am getting: 
15/02/11 11:01:23 ERROR ActorSystemImpl: Uncaught fatal error from thread
[sparkDriver-akka.actor.default-dispatcher-3] shutting down ActorSystem
[sparkDriver]
java.lang.OutOfMemoryError: Java heap space

This seems very odd given how small the dataset is compared to the available memory. Any idea what could be wrong with my configuration?
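
In case it matters, I submit the job roughly like this (the class name, master URL, and jar name are placeholders; note that I do not override the driver memory, so it is at the 1.2 default):

spark-submit \
  --class RFTrain \
  --master spark://master:7077 \
  rf-train.jar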

PS: I am running Spark 1.2.


