One way to fix the issue is to raise the executor heap by setting "spark.executor.memory"
(e.g. to "8g") in your SparkConf before creating the SparkContext.

Example code:

 import org.apache.spark.{SparkConf, SparkContext}

 val conf = new SparkConf()
   .set("spark.executor.memory", "8g")     // heap size per executor
   .set("spark.locality.wait", "10000")    // ms to wait for a data-local task slot
 val sc = new SparkContext(master, "whatever", conf)  // master is your cluster URL, e.g. "local[4]"
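
If you want to confirm that the setting actually took effect, you can read it back from the
live context. A minimal sketch, assuming the sc created above:

 // getConf returns a copy of the context's SparkConf; get() fetches the effective value
 println(sc.getConf.get("spark.executor.memory"))   // should print "8g"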


