Re: Dataset : Issue with Save

2017-03-17 Thread Yong Zhang
From: Bahubali Jain <bahub...@gmail.com> Sent: Thursday, March 16, 2017 11:41 PM To: Yong Zhang Cc: user@spark.apache.org Subject: Re: Dataset : Issue with Save I am using Spark 2.0. There are comments in the ticket since Oct-2016 which clearly mention that issue

Re: Dataset : Issue with Save

2017-03-16 Thread Bahubali Jain
Sent: Thursday, March 16, 2017 10:34 PM To: Yong Zhang Cc: user@spark.apache.org Subject: Re: Dataset : Issue with Save Hi, Was this not yet resolved? It's a very common requirement to save a dataframe; is there a better way to save a dataframe by avoiding data being sent to the driver?

Re: Dataset : Issue with Save

2017-03-16 Thread Yong Zhang
Sent: Thursday, March 16, 2017 10:34 PM To: Yong Zhang Cc: user@spark.apache.org Subject: Re: Dataset : Issue with Save Hi, Was this not yet resolved? It's a very common requirement to save a dataframe; is there a better way to save a dataframe by avoiding data being sent to the driver?

Re: Dataset : Issue with Save

2017-03-16 Thread Bahubali Jain
From: Bahubali Jain <bahub...@gmail.com> Sent: Thursday, March 16, 2017 1:39 PM To: user@spark.apache.org Subject: Dataset : Issue with Save Hi, While saving a dataset using mydataset.write().csv("outputlocation") I am running into an exception

Re: Dataset : Issue with Save

2017-03-16 Thread Yong Zhang
memory space for the driver even when there are no requests to collect data back to the driver. From: Bahubali Jain <bahub...@gmail.com> Sent: Thursday, March 16, 2017 1:39 PM To: user@spark.apache.org Subject: Dataset : Issue with Save Hi, While saving a dataset
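
A minimal sketch of the adjustment Yong's explanation points toward: raising spark.driver.maxResultSize (default 1g) when the session is created, so the driver can hold the serialized results of a large number of tasks. The class name, app name, and the "4g" value are illustrative assumptions, not details from this thread; setting the value to "0" removes the limit entirely, at the risk of a driver out-of-memory error.

    import org.apache.spark.sql.SparkSession;

    public class MaxResultSizeExample {
        public static void main(String[] args) {
            // Hypothetical session setup: raise the cap on the total size of
            // serialized task results the driver will accept (default is 1g).
            SparkSession spark = SparkSession.builder()
                    .appName("csv-write-example")
                    .config("spark.driver.maxResultSize", "4g")
                    .getOrCreate();
            spark.stop();
        }
    }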

Dataset : Issue with Save

2017-03-16 Thread Bahubali Jain
Hi, While saving a dataset using mydataset.write().csv("outputlocation") I am running into an exception "Total size of serialized results of 3722 tasks (1024.0 MB) is bigger than spark.driver.maxResultSize (1024.0 MB)". Does it mean that for saving a dataset the whole of the data is sent to the driver?
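
For readers hitting the same error: a hedged sketch of the write in question, plus one workaround often suggested, coalescing to fewer partitions before writing so that fewer per-task results are serialized back to the driver. The input path, the partition count of 200, and the class name are placeholders, not details from this thread.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class SaveDatasetExample {
        public static void main(String[] args) {
            SparkSession spark = SparkSession.builder()
                    .appName("save-dataset-example")
                    .getOrCreate();

            // Placeholder input; stands in for however mydataset was actually built.
            Dataset<Row> mydataset = spark.read().parquet("inputlocation");

            // Fewer partitions means fewer tasks, so the per-task result metadata
            // accumulated on the driver stays smaller. 200 is an arbitrary example.
            mydataset.coalesce(200)
                     .write()
                     .csv("outputlocation");

            spark.stop();
        }
    }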