Re: Spark 1.6 Driver Memory Issue

2016-06-01 Thread kali.tumm...@gmail.com
Hi, I am using the spark-sql shell. While launching it I run spark-sql --conf spark.driver.maxResultSize=20g. I also tried spark-sql --conf "spark.driver.maxResultSize"="20g" but still no luck. Do I need to use a set command, something like spark-sql --conf set "spark.driver.maxResultSize"="20g"?
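For reference, a minimal sketch of the launch syntax, assuming the Spark 1.6 spark-sql CLI: the recognised property name is spark.driver.maxResultSize, and --conf takes a single key=value pair, with no separate set keyword.

    # pass the property at launch time; quoting the whole pair is optional
    spark-sql --conf spark.driver.maxResultSize=2g
    spark-sql --conf "spark.driver.maxResultSize=2g"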

Re: Spark 1.6 Driver Memory Issue

2016-06-01 Thread ashesh_28
Hi Karthik, you must set the value before the SparkContext (sc) is created. Also, don't assign an excessive value like 20g for maxResultSize; as per your error message, 2G at most should do. Also, if you are using Java 1.8, please add the below section in your yarn-site.xml
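With the spark-sql CLI, "before the SparkContext is created" effectively means supplying the value at launch time or in spark-defaults.conf, rather than with a SET statement once the session is already running. A minimal sketch, assuming the default conf directory layout:

    # conf/spark-defaults.conf -- read when the driver starts
    spark.driver.maxResultSize    2g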

Re: Spark 1.6 Driver Memory Issue

2016-06-01 Thread Kishoore MV
Hi Kali, in the shuffle stage the maximum memory is 2 GB, and your error shows it is expecting more than that. Can you let me know your cluster config details? Thanks & Regards, Kishore M

Spark 1.6 Driver Memory Issue

2016-06-01 Thread kali.tumm...@gmail.com
Hi All, I am getting a Spark driver memory issue even after overriding the conf with --conf spark.driver.maxResultSize=20g, and I also set it in my SQL script (set spark.driver.maxResultSize=16;), but the same error keeps happening: Job aborted due to stage failure: Total size of