Hi all, I am hitting a Spark driver memory limit even after overriding the config with --conf spark.driver.maxResultSize=20g on spark-submit. I also added "set spark.driver.maxResultSize=16;" in my SQL script, but the same error keeps occurring:
Job aborted due to stage failure: Total size of serialized results of 79 tasks (1035.2 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)

Any thoughts?

Thanks,
Sri
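
A note on what may be going on: spark.driver.maxResultSize is read once when the driver starts, so a SET statement inside a SQL script will not change it at runtime, and a unitless value such as "16" is parsed as bytes, not gigabytes. Below is a minimal sketch (assuming a Scala application on the Spark 1.6 API; the app name and value are illustrative) of setting it before the SparkContext is created:

    // Minimal sketch (Spark 1.6 Scala API): the value must be set on the
    // SparkConf before the SparkContext is created, because the driver
    // reads spark.driver.maxResultSize once at startup.
    import org.apache.spark.{SparkConf, SparkContext}

    object MaxResultSizeExample {
      def main(args: Array[String]): Unit = {
        val conf = new SparkConf()
          .setAppName("max-result-size-example")
          // Include a unit: a bare "16" is parsed as 16 bytes, not 16 GB.
          .set("spark.driver.maxResultSize", "4g") // "0" disables the limit
        val sc = new SparkContext(conf)
        // ... job logic ...
        sc.stop()
      }
    }

When submitting with spark-submit, note that --conf must appear before the application JAR or script; anything after it is passed as an application argument and silently ignored:

    spark-submit --conf spark.driver.maxResultSize=4g --class MyApp app.jar

The error still reporting the 1024.0 MB default suggests the 20g setting never reached the driver, which is consistent with a misplaced --conf flag.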