Hi,

I figured out my problem, so I wanted to share my findings. I was trying to
broadcast an array with 4 million elements, roughly 150 MB in size. Every
time I tried to broadcast it, I got an OutOfMemory error. I fixed the problem
by increasing the driver memory with:
export SPARK_MEM="2g"

Using SPARK_DAEMON_MEM or spark.executor.memory did not help in this case! I
don't have a good understanding of all these settings, and I have the feeling
many people are in the same situation.
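For anyone hitting the same error, here is a minimal sketch of the options involved. Note that SPARK_MEM is a legacy environment variable; on more recent Spark releases the driver heap is set separately via spark.driver.memory or the --driver-memory flag. The application class and jar names below are placeholders, not from the original post:

```shell
# Legacy approach (what worked for me): sets the heap size used by
# Spark processes launched from this shell, including the driver.
export SPARK_MEM="2g"

# On newer Spark versions, the driver heap is configured on its own
# (the jar and class names here are hypothetical examples):
spark-submit --driver-memory 2g --class MyApp my-app.jar

# Equivalently, in conf/spark-defaults.conf:
# spark.driver.memory  2g
```

The key point is that a large broadcast variable is first materialized on the driver, so it is the driver heap (not spark.executor.memory or the daemon memory) that must be large enough to hold it.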



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/driver-memory-tp10486p10489.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.