Re: OS killing Executor due to high (possibly off heap) memory usage

2017-01-03 Thread Koert Kuipers
… 6 GB would mean more data loaded into memory and more GC, which can cause issues. Also, have you tried to persist the data in any way? If so, then that might be causing an issue. Lastly, I am not sure if your data has a skew and if that is forcing a lot of data to be on one executor node.
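A minimal sketch of the persistence point above, assuming an RDD-based job run from the spark-shell; the dataset path and variable name are made up for illustration. Persisting with a serialized, disk-spilling storage level instead of the default MEMORY_ONLY keeps less deserialized data on the heap and so reduces GC pressure:

  import org.apache.spark.storage.StorageLevel

  // Hypothetical input; replace with the real dataset. Serialized, disk-backed
  // caching trades some CPU for a smaller, more predictable heap footprint.
  val bigRdd = sc.textFile("hdfs:///path/to/huge/dataset")
  bigRdd.persist(StorageLevel.MEMORY_AND_DISK_SER)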

Re: OS killing Executor due to high (possibly off heap) memory usage

2016-12-08 Thread Aniket Bhatnagar

Re: OS killing Executor due to high (possibly off heap) memory usage

2016-11-26 Thread Koert Kuipers
… not sure if your data has a skew and if that is forcing a lot of data to be on one executor node.

Re: OS killing Executor due to high (possibly off heap) memory usage

2016-11-25 Thread Aniket Bhatnagar
… of data to be on one executor node.

RE: OS killing Executor due to high (possibly off heap) memory usage

2016-11-24 Thread Shreya Agarwal
Try setting spark.yarn.executor.memoryOverhead 1 On Thu, Nov 24, 2016 at 11:16 AM, Aniket Bhatnagar <aniket.bhatna...@gmail.com> wrote: …

Re: OS killing Executor due to high (possibly off heap) memory usage

2016-11-24 Thread Rodrick Brown
Try setting spark.yarn.executor.memoryOverhead 1 On Thu, Nov 24, 2016 at 11:16 AM, Aniket Bhatnagar <aniket.bhatna...@gmail.com> wrote: Hi Spark users, I am running a job that does a join of a huge dataset (7 TB+) and the executors keep crashing randomly, eventually causing the job to crash.
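A minimal sketch of the suggestion above, assuming a YARN deployment on Spark 1.6/2.x (later releases renamed the property spark.executor.memoryOverhead). The exact value recommended in the thread is truncated in this archive, so the 2048 MB below is only a placeholder:

  import org.apache.spark.SparkConf
  import org.apache.spark.SparkContext

  val conf = new SparkConf()
    .setAppName("huge-join")
    .set("spark.executor.memory", "6g")                 // on-heap size per executor (example)
    .set("spark.yarn.executor.memoryOverhead", "2048")  // off-heap headroom per executor, in MB

  val sc = new SparkContext(conf)

  // The same setting can be passed at submit time instead:
  //   spark-submit --conf spark.yarn.executor.memoryOverhead=2048 ...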

OS killing Executor due to high (possibly off heap) memory usage

2016-11-24 Thread Aniket Bhatnagar
Hi Spark users, I am running a job that does a join of a huge dataset (7 TB+), and the executors keep crashing randomly, eventually causing the job to crash. There are no out-of-memory exceptions in the logs, and looking at the dmesg output, it seems the OS killed the JVM because of high memory usage.
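For context on why a dying executor can leave no OutOfMemoryError behind: the JVM heap is only part of an executor's footprint, and off-heap allocations (Netty buffers, shuffle, native libraries) live outside it. Under YARN, Spark requests a container sized as the executor memory plus spark.yarn.executor.memoryOverhead, which defaults to 10% of the executor memory with a 384 MB floor; when real off-heap usage outgrows that reservation, the process is killed from outside the JVM rather than throwing a Java exception. A back-of-the-envelope sketch in Scala, assuming a 6 GB heap (a figure that comes up later in the thread):

  // Default overhead: max(384 MB, 10% of executor memory)
  val executorMemoryMb  = 6 * 1024
  val defaultOverheadMb = math.max(384, (0.10 * executorMemoryMb).toInt)  // 614 MB
  val containerLimitMb  = executorMemoryMb + defaultOverheadMb            // 6758 MB
  println(s"container limit ~ $containerLimitMb MB, off-heap headroom only $defaultOverheadMb MB")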