Hi Kali,

The default for spark.driver.maxResultSize is 1 GB (1024 MB), and your error shows the job is still running against that default, so your override does not appear to have taken effect. Can you let me know your cluster config details?
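One thing to check: as far as I know this property is read from the SparkConf when the SparkContext starts, so setting it later from a SQL script will not change it, and a --conf placed after the application JAR on the spark-submit line is treated as an application argument rather than a Spark option. A minimal sketch of setting it up front via the Scala API (the app name is just a placeholder):

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.driver.maxResultSize must be set before the SparkContext
    // starts; changing it afterwards (e.g. via a SQL "set") has no effect.
    // Include a size unit: a bare number such as "16" is read as bytes.
    val conf = new SparkConf()
      .setAppName("DriverResultSizeDemo")        // placeholder name
      .set("spark.driver.maxResultSize", "16g")  // default is 1g; "0" means unlimited
    val sc = new SparkContext(conf)

The command-line equivalent would be spark-submit --conf spark.driver.maxResultSize=16g ... with the option before the application JAR.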
Thanks & Regards Kishore M > On 01-Jun-2016, at 9:11 PM, "kali.tumm...@gmail.com" <kali.tumm...@gmail.com> > wrote: > > Hi All , > > I am getting spark driver memory issue even after overriding the conf by > using --conf spark.driver.maxResultSize=20g and I also mentioned in my sql > script (set spark.driver.maxResultSize =16;) but still the same error > happening. > > Job aborted due to stage failure: Total size of serialized results of 79 > tasks (1035.2 MB) is bigger than spark.driver.maxResultSize (1024.0 MB) > > Any thoughts ? > > Thanks > Sri > > > > -- > View this message in context: > http://apache-spark-user-list.1001560.n3.nabble.com/Saprk-1-6-Driver-Memory-Issue-tp27063.html > Sent from the Apache Spark User List mailing list archive at Nabble.com. > > --------------------------------------------------------------------- > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org > For additional commands, e-mail: user-h...@spark.apache.org > --------------------------------------------------------------------- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org