Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-02-13 Thread StanZhai
I've filed a JIRA for this problem: https://issues.apache.org/jira/browse/SPARK-19532. I've tried setting `spark.speculation` to `false`, but the off-heap memory still exceeds about 10G after triggering a full GC on the executor
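For reference, disabling speculation as mentioned above is normally done through standard Spark configuration; this is a sketch of the usual syntax, not a quote from the original message:

```properties
# spark-defaults.conf
spark.speculation  false
```

or equivalently on the command line: `spark-submit --conf spark.speculation=false ...`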

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-02-07 Thread StanZhai
From the thread dump page of the executor in the Web UI, I found that there are about 1300 threads named "DataStreamer for file /test/data/test_temp/_temporary/0/_temporary/attempt_20170207172435_80750_m_69_1/part-00069-690407af-0900-46b1-9590-a6d6c696fe68.snappy.parquet" in TIMED_WAITING state like
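To reproduce that observation outside the Web UI, the same count can be pulled from a plain `jstack` dump of the executor JVM. A minimal sketch in Python, run here against a made-up two-thread excerpt (the dump text is illustrative, not taken from the original report):

```python
# Hypothetical excerpt of `jstack <executor-pid>` output; the thread names
# mirror the DataStreamer threads reported in this thread.
dump = """\
"DataStreamer for file /test/data/a.snappy.parquet" #311 daemon prio=5 waiting on condition
   java.lang.Thread.State: TIMED_WAITING (parking)
"DataStreamer for file /test/data/b.snappy.parquet" #312 daemon prio=5 waiting on condition
   java.lang.Thread.State: TIMED_WAITING (parking)
"Executor task launch worker-0" #42 daemon prio=5 runnable
   java.lang.Thread.State: RUNNABLE
"""

def count_stuck_streamers(dump_text):
    """Count DataStreamer threads whose state line says TIMED_WAITING."""
    lines = dump_text.splitlines()
    count = 0
    for i, line in enumerate(lines):
        # In jstack output the state appears on the line after the thread name.
        if line.startswith('"DataStreamer') and i + 1 < len(lines):
            if "TIMED_WAITING" in lines[i + 1]:
                count += 1
    return count

print(count_stuck_streamers(dump))  # → 2
```

A count in the hundreds or thousands, as reported above, points at leaked HDFS output streams rather than ordinary task activity.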

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-02-03 Thread Jacek Laskowski
Hi, Just to throw a few zlotys into the conversation: I believe that Spark Standalone does not enforce any memory checks to limit or even kill executors that exceed their requested memory (as YARN does). I've also found that memory doesn't have much use while scheduling tasks; only CPU matters. My

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-02-02 Thread StanZhai
CentOS 7.1, Linux version 3.10.0-229.el7.x86_64 (buil...@kbuilder.dev.centos.org) (gcc version 4.8.2 20140120 (Red Hat 4.8.2-16) (GCC) ) #1 SMP Fri Mar 6 11:36:42 UTC 2015 Michael Allman-2 wrote > Hi Stan, > > What OS/version are you using? > > Michael > >> On Jan 22, 2017, at 11:36 PM,

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-01-23 Thread Michael Allman
Hi Stan, What OS/version are you using? Michael > On Jan 22, 2017, at 11:36 PM, StanZhai wrote: > > I'm using Parallel GC. > rxin wrote >> Are you using G1 GC? G1 sometimes uses a lot more memory than the size >> allocated. >> >> >> On Sun, Jan 22, 2017 at 12:58 AM

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-01-22 Thread StanZhai
I'm using Parallel GC. rxin wrote > Are you using G1 GC? G1 sometimes uses a lot more memory than the size > allocated. > > > On Sun, Jan 22, 2017 at 12:58 AM StanZhai > mail@ > wrote: > >> Hi all, >> >> >> >> We just upgraded our Spark from 1.6.2 to 2.1.0. >> >> >> >> Our Spark application

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-01-22 Thread Koert Kuipers
could this be related to SPARK-18787? On Sun, Jan 22, 2017 at 1:45 PM, Reynold Xin wrote: > Are you using G1 GC? G1 sometimes uses a lot more memory than the size > allocated. > > > On Sun, Jan 22, 2017 at 12:58 AM StanZhai wrote: > >> Hi all, >> >> >>

Re: Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-01-22 Thread Reynold Xin
Are you using G1 GC? G1 sometimes uses a lot more memory than the size allocated. On Sun, Jan 22, 2017 at 12:58 AM StanZhai wrote: > Hi all, > > > > We just upgraded our Spark from 1.6.2 to 2.1.0. > > > > Our Spark application is started by spark-submit with config of > >
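As a side note on the G1 question: which collector an executor is actually running can be checked with the standard JDK tools, and the collector is pinned through Spark's executor JVM options. A sketch using standard HotSpot flags (the placeholder pid and the `...` are not from this thread):

```shell
# Verify which collector a running executor JVM is using:
jcmd <executor-pid> VM.flags | grep -Eo 'UseG1GC|UseParallelGC'

# Pin the parallel collector explicitly when submitting:
spark-submit \
  --conf "spark.executor.extraJavaOptions=-XX:+UseParallelGC" \
  ...
```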

Executors exceed maximum memory defined with `--executor-memory` in Spark 2.1.0

2017-01-22 Thread StanZhai
Hi all, We just upgraded our Spark from 1.6.2 to 2.1.0. Our Spark application is started by spark-submit with `--executor-memory 35G` in standalone mode, but the actual memory use goes up to 65G after a full GC (`jmap -histo:live $pid`), as follows: test@c6 ~ $ ps aux | grep
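For what it's worth, the gap being reported is simple to state numerically. The two figures below come from the report itself; the subtraction is plain arithmetic (not any Spark formula) and only shows how much resident memory must be living outside the Java heap — native allocations, thread stacks, direct/NIO buffers, metaspace, and so on:

```python
# Figures from the report in this thread.
executor_memory_gb = 35   # --executor-memory, i.e. the JVM heap cap (-Xmx)
observed_rss_gb = 65      # resident set size seen via `ps aux`

# Anything above -Xmx is necessarily off-heap, since a full GC cannot
# shrink it and the heap itself cannot grow past the cap.
off_heap_gb = observed_rss_gb - executor_memory_gb
print(off_heap_gb)  # → 30
```

A 30G off-heap footprint on a 35G heap is far beyond normal JVM overhead, which is what makes this look like a native-memory leak rather than a GC tuning issue.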