Re: issue with spark.driver.maxResultSize parameter in spark 1.3

2015-11-01 Thread karthik kadiyam
Did anyone have an issue setting the spark.driver.maxResultSize value? On Friday, October 30, 2015, karthik kadiyam wrote: > Hi Shahid, > I played around with the Spark driver memory too. In the conf file it was set to "--driver-memory 20G" first. When I changed the

Re: issue with spark.driver.maxResultSize parameter in spark 1.3

2015-11-01 Thread shahid ashraf
Is your process getting killed? If yes, then try to check using dmesg. On Mon, Nov 2, 2015 at 8:17 AM, karthik kadiyam < karthik.kadiyam...@gmail.com> wrote: > Did anyone have an issue setting the spark.driver.maxResultSize value?
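The dmesg check suggested above looks for OOM-killer entries in the kernel ring buffer. A minimal sketch (the first command is the real check; the `echo` pipeline below it feeds an illustrative sample line so the grep can be demonstrated without root access):

```shell
# Real check (may require root): was any process killed by the OOM killer?
# dmesg | grep -i "killed process"

# Illustrative run against a sample kernel-log line:
echo "Out of memory: Killed process 12345 (java) total-vm:20971520kB" \
  | grep -i "killed process"
```

If the driver JVM shows up in such a line, the host ran out of physical memory, which points at lowering `--driver-memory` or moving to a bigger machine rather than tuning maxResultSize.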

Re: issue with spark.driver.maxResultSize parameter in spark 1.3

2015-10-30 Thread karthik kadiyam
Hi Shahid, I played around with the Spark driver memory too. In the conf file it was set to "--driver-memory 20G" first. When I changed spark.driver.maxResultSize from the default to 2g, I also changed the driver memory to 30G and tried that too. It gave me the same error saying "bigger than

Re: issue with spark.driver.maxResultSize parameter in spark 1.3

2015-10-29 Thread shahid ashraf
Hi, I guess you need to increase the Spark driver memory as well, but that should be set in the conf files. Let me know if that resolves it. On Oct 30, 2015 7:33 AM, "karthik kadiyam" wrote: > Hi, > In a Spark Streaming job I had the following setting
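Setting these values "in conf files", as suggested above, would typically mean `conf/spark-defaults.conf`. A sketch of what that could look like (the 20g/2g values are taken from the thread and are illustrative, not recommendations):

```
# conf/spark-defaults.conf -- read at spark-submit time,
# before the driver JVM starts, so driver memory actually takes effect here.
spark.driver.memory          20g
spark.driver.maxResultSize   2g
```

Note that `spark.driver.memory` cannot be raised from inside an already-running driver JVM; it must come from spark-defaults.conf or the `--driver-memory` flag to spark-submit.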

issue with spark.driver.maxResultSize parameter in spark 1.3

2015-10-29 Thread karthik kadiyam
Hi, in a Spark Streaming job I had the following setting: this.jsc.getConf().set("spark.driver.maxResultSize", "0"); and I got the following error in the job: User class threw exception: Job aborted due to stage failure: Total size of serialized results of 120 tasks (1082.2 MB) is bigger
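One plausible explanation for the setting being ignored (an assumption on my part, not confirmed in the thread): `getConf()` on a live context returns a copy of the configuration, so `set(...)` calls made after the context exists have no effect, and the job keeps running with the 1g default that the 1082.2 MB result exceeds. The value `"0"` does mean unlimited, but it has to be applied before the context is created. A minimal sketch, assuming the Spark 1.3 Java streaming API (app name and batch interval are hypothetical; not compiled here):

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class MaxResultSizeExample {
    public static void main(String[] args) {
        // Set maxResultSize BEFORE building the context:
        // mutating a running context's conf copy is silently ignored.
        SparkConf conf = new SparkConf()
            .setAppName("streaming-app")              // hypothetical name
            .set("spark.driver.maxResultSize", "0");  // "0" = unlimited

        JavaStreamingContext jsc =
            new JavaStreamingContext(conf, Durations.seconds(10));
        // ... define the streaming job, then jsc.start() / awaitTermination()
    }
}
```

Equivalently, the value can be passed at submit time with `--conf spark.driver.maxResultSize=0`, which avoids touching the application code at all.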