Thanks Akhil. When I set it in spark-defaults.conf it didn't work, but when
I passed it as a property to spark-submit it worked:

--conf spark.driver.maxResultSize=3G
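
For completeness, this is roughly what the full submit command looked like
(the main class and jar names below are just placeholders, not my actual job):

    spark-submit \
      --class com.example.MyStreamingJob \
      --master yarn-cluster \
      --conf spark.driver.maxResultSize=3g \
      my-streaming-job.jar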

Also, I was trying to reset the configuration in the code, and I don't think
it works that way. As you were saying, setting the property before creating
the SparkContext should work.
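
For anyone who runs into the same thing, here is a minimal sketch of that
approach, assuming a plain Java streaming app (the class name, batch interval
and job body are placeholders; the point is that the property goes on the
SparkConf before the context exists):

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Duration;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class MaxResultSizeExample {
        public static void main(String[] args) throws Exception {
            // Set the limit on the SparkConf *before* the streaming context
            // is created; calling jsc.getConf().set(...) afterwards has no effect.
            SparkConf conf = new SparkConf()
                    .setAppName("MaxResultSizeExample")
                    .set("spark.driver.maxResultSize", "3g");

            JavaStreamingContext jsc = new JavaStreamingContext(conf, new Duration(10000));

            // ... define the streaming job on jsc here ...

            jsc.start();
            jsc.awaitTermination();
        }
    }

The same value can of course also go in conf/spark-defaults.conf or on the
--conf flag at submit time, as above.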

Thanks
Karthik

On Monday, November 9, 2015, Akhil Das <ak...@sigmoidanalytics.com> wrote:

> You can set it in your conf/spark-defaults.conf file, or you will have to
> set it before you create the SparkContext.
>
> Thanks
> Best Regards
>
> On Fri, Oct 30, 2015 at 4:31 AM, karthik kadiyam <
> karthik.kadiyam...@gmail.com> wrote:
>
>> Hi,
>>
>> In a Spark Streaming job I had the following setting:
>>
>>             this.jsc.getConf().set("spark.driver.maxResultSize", "0");
>> and I got the following error in the job:
>>
>> User class threw exception: Job aborted due to stage failure: Total size
>> of serialized results of 120 tasks (1082.2 MB) is bigger than
>> spark.driver.maxResultSize (1024.0 MB)
>>
>> I realized that the default value is 1 GB, so I changed
>> the configuration as below:
>>
>> this.jsc.getConf().set("spark.driver.maxResultSize", "2g");
>>
>> but when I ran the job it gave the same error:
>>
>> User class threw exception: Job aborted due to stage failure: Total size
>> of serialized results of 120 tasks (1082.2 MB) is bigger than
>> spark.driver.maxResultSize (1024.0 MB)
>>
>> So the change I made is not being picked up by the job. My questions are:
>>
>> - Is setting "spark.driver.maxResultSize" to "2g" this way the right way
>> to change it, or is there another way to do it?
>> - Is this a bug in Spark 1.3, or has anyone else had this issue before?
>>
>
>
