Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar

Re: Spark Java Heap Error

2016-09-13 Thread neil90
… df.cache(StorageLevel.MEMORY_AND_DISK).
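The suggestion above can be sketched in PySpark. One caveat: in both the Scala and Python APIs, `cache()` takes no arguments; the call that accepts a storage level is `persist()`. A minimal sketch, assuming a running `SparkSession` named `spark` (the DataFrame here is a placeholder):

```python
from pyspark import StorageLevel

# Placeholder DataFrame; in the thread this would be the user's own df.
df = spark.range(10_000_000)

# MEMORY_AND_DISK spills partitions that do not fit in memory to local
# disk instead of failing with an OutOfMemoryError.
df.persist(StorageLevel.MEMORY_AND_DISK)

# Persistence is lazy; an action such as count() materializes the cache.
df.count()
```

This cannot run outside a Spark cluster, so it is a sketch rather than a verified snippet.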

Re: Spark Java Heap Error

2016-09-13 Thread Baktaawar
> Memory is close to half of 16gb available.

Re: Spark Java Heap Error

2016-09-13 Thread neil90
Double-check your driver memory in the Spark Web UI; make sure the driver memory is close to half of the 16 GB available.
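On a 16 GB machine, the sizing advice above might look like this at submit time (`my_job.py` is a hypothetical job script; the flag itself is the standard `spark-submit` option):

```shell
# Give the driver roughly half the machine's 16 GB, leaving headroom
# for the OS and any local executors.
spark-submit \
  --driver-memory 8g \
  my_job.py
```

The resulting value is visible afterwards in the Spark Web UI under the Executors tab, which is what the reply suggests checking.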

Re: Spark Java Heap Error

2016-09-12 Thread Baktaawar
…va:745)

Re: Spark Java Heap Error

2016-09-09 Thread Baktaawar
> spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value
>
> You might need to change your spark.driver.maxResultSize settings if you
> plan on doing a collect on the entire rdd/dataframe.

Re: Spark Java Heap Error

2016-09-07 Thread neil90
spark.executor.extraJavaOptions -XX:+PrintGCDetails -Dkey=value

You might need to change your spark.driver.maxResultSize settings if you plan on doing a collect on the entire rdd/dataframe.
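Put together, the two settings mentioned in this reply could go into `spark-defaults.conf` (values here are illustrative; both property names are standard Spark configuration keys, and `spark.driver.maxResultSize` defaults to 1g, with 0 meaning unlimited):

```
# spark-defaults.conf (illustrative values)
# Print GC details in executor logs to diagnose heap pressure.
spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value
# Raise the cap on the total serialized result size a collect() may
# return to the driver; must fit within the driver's heap.
spark.driver.maxResultSize       2g
```

Collecting an entire DataFrame pulls every partition onto the driver, so `maxResultSize` and `--driver-memory` need to be raised together.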