Did anyone have issues setting the spark.driver.maxResultSize value?
On Friday, October 30, 2015, karthik kadiyam <karthik.kadiyam...@gmail.com>
wrote:
> Hi Shahid,
>
> I played around with the Spark driver memory too. In the conf file it was
> set to "--driver-memory 20G".
> shahid wrote:
> Hi
> I guess you need to increase the Spark driver memory as well, but that
> should be set in the conf files.
> Let me know if that resolves it.
> On Oct 30, 2015 7:33 AM, "karthik kadiyam" <karthik.kadiyam...@gmail.com>
> wrote:
>
Hi,

In a Spark Streaming job I had the following setting:

this.jsc.getConf().set("spark.driver.maxResultSize", "0");

and the job failed with the error below:

User class threw exception: Job aborted due to stage failure: Total size of
serialized results of 120 tasks (1082.2 MB) is bigger than
spark.driver.maxResultSize
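One thing worth checking: spark.driver.maxResultSize is read when the
SparkContext is created, so setting it via getConf().set(...) on an
already-created context has no effect. It needs to be supplied before the
driver starts, e.g. at submit time or in spark-defaults.conf. A minimal
sketch (the class name and jar path below are placeholders, not from the
original thread):

```shell
# Sketch: pass both settings at submit time so they are applied
# before the driver JVM starts. "0" means unlimited result size.
spark-submit \
  --class com.example.MyStreamingJob \
  --driver-memory 20G \
  --conf spark.driver.maxResultSize=0 \
  my-streaming-job.jar
```

Alternatively, set it on the SparkConf object itself before constructing the
JavaStreamingContext; modifying the conf obtained from an existing context
does not change the running configuration.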