Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-11-01 Thread kant kodali
Here is a UI view of my thread dump:

http://fastthread.io/my-thread-report.jsp?p=c2hhcmVkLzIwMTYvMTEvMS8tLWpzdGFja19kdW1wX3dpbmRvd19pbnRlcnZhbF8xbWluX2JhdGNoX2ludGVydmFsXzFzLnR4dC0tNi0xNy00Ng==
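For readers without the web UI, a similar per-state tally can be sketched from a raw jstack dump with standard shell tools. The dump file below is fabricated so the example is self-contained; in practice you would feed it the output of `jstack <pid>`:

```shell
# Fabricate a tiny jstack-style dump so the example runs standalone.
printf '%s\n' \
  'java.lang.Thread.State: RUNNABLE' \
  'java.lang.Thread.State: TIMED_WAITING (sleeping)' \
  'java.lang.Thread.State: RUNNABLE' > dump.txt

# Tally thread states, most common first.
grep -o 'java.lang.Thread.State: [A-Z_]*' dump.txt | sort | uniq -c | sort -rn
```

A sudden growth in one state (e.g. thousands of TIMED_WAITING timer threads) usually points at the leak.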




Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-31 Thread kant kodali
Hi Vadim,

Thank you so much; this was a very useful command. The conversation is
going on here:

https://www.mail-archive.com/user@spark.apache.org/msg58656.html

or you can just google "why spark driver program is creating so many
threads? How can I limit this number?"

Please take a look if you are interested.

Thanks a lot!


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-31 Thread Vadim Semenov
Have you tried to get the number of threads in the running process using
`cat /proc/<pid>/status`?
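A minimal sketch of this check (plus an equivalent count via the per-thread task directories), using this shell's own PID as a stand-in for the Spark driver's PID; Linux-only, since it relies on /proc:

```shell
PID=$$   # substitute the driver JVM's PID here

# The Threads: field reports the current native thread count.
grep '^Threads:' /proc/$PID/status

# Equivalently, each thread has its own directory under task/.
ls /proc/$PID/task | wc -l
```

Sampling this a few times while the job runs shows whether the thread count grows without bound.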


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread kant kodali
Yes, I did run ps -ef | grep "app_name" and the user is root.
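To double-check which user a PID actually runs as, rather than eyeballing the ps output, here is a hedged sketch, again using this shell's PID as a placeholder for the JVM's PID:

```shell
PID=$$   # substitute the JVM's PID from ps -ef

# Effective user name of the process.
ps -o user= -p "$PID"

# Real/effective/saved/filesystem UIDs straight from the kernel (Linux).
awk '/^Uid:/{print "uids:", $2, $3, $4, $5}' /proc/$PID/status
```

If the UIDs differ from the user whose limits you inspected with ulimit -a, you are looking at the wrong limits.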




Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread Chan Chor Pang

sorry, the UID



Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread Chan Chor Pang

Actually, if the max user processes limit is not the problem, I have no idea.

But I am still suspecting the user, as the user who runs spark-submit is not
necessarily the user of the JVM process.

Can you make sure that when you run "ps -ef | grep {your app id}" the PID is
owned by root?


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread kant kodali
The java process is run by root, and it has the same config:

sudo -i

ulimit -a

core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 120242
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 1024
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 120242
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited
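One caveat worth noting: `ulimit -a` in a fresh root shell shows that shell's limits, not necessarily the limits of the already-running JVM. On Linux, the limits the kernel actually enforces for a live process can be read from /proc directly (PID here is a placeholder):

```shell
PID=$$   # substitute the driver JVM's PID

# Per-process limits as the kernel enforces them for this PID.
grep -E 'Max (processes|open files|stack size)' /proc/$PID/limits
```

This removes any doubt about which user's or session's limits apply to the process that is failing.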




Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread Chan Chor Pang
I had the same exception before, and the problem was fixed after I changed
the nproc conf.

> max user processes  (-u) 120242
↑This config does look good.
Are you sure the user who ran ulimit -a is the same user who runs the Java
process? Depending on how you submit the job and your settings, the spark
job may execute as another user.
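For reference, a typical nproc override looks like the following; the file name and the values are illustrative, not recommendations (on CentOS 6 the stock file ships `* soft nproc 1024`):

```
# /etc/security/limits.d/90-nproc.conf (illustrative values)
*       soft    nproc   65536
root    soft    nproc   unlimited
```

After editing, the new limit applies to new login sessions of the affected users, not to processes already running.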




On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang
> wrote:

Not sure for Ubuntu, but I think you can just create the file
yourself; the syntax is the same as /etc/security/limits.conf.

nproc.conf limits not only the java process but all processes
owned by the same user,

so even if the jvm process does nothing, if the corresponding
user is busy in other ways,
the jvm process will still not be able to create new threads.

btw, the default limit for CentOS is 1024.


On 10/31/16 9:51 AM, kant kodali wrote:


On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang
> wrote:

/etc/security/limits.d/90-nproc.conf


Hi,

I am using Ubuntu 16.04 LTS. I have this directory
/etc/security/limits.d/ but I don't have any files underneath
it. This error happens after running for 4 to 5 hours. I
wonder if this is a GC issue, and whether I should
use CMS. I have also posted this on SO since I haven't got
much response to this question:

http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native




Thanks,
kant


-- 
---**---*---*---*---
INDETAIL Inc.
Nearshore Integrated Services Division
Game Services Department
Chan Chor Pang
E-mail: chin...@indetail.co.jp
URL: http://www.indetail.co.jp

[Sapporo Head Office/LABO/LABO2]
060-0042
Odori Nishi 9-chome 3-33, Chuo-ku, Sapporo
Kitako Center Building
(Head Office/LABO2: 2F, LABO: 9F)
TEL: 011-206-9235  FAX: 011-206-9236

[Tokyo Branch]
108-0014
Cross Office Mita, Shiba 5-chome 29-20, Minato-ku, Tokyo
TEL: 03-6809-6502  FAX: 03-6809-6504

[Nagoya Satellite]
460-0002
NAYUTA BLD, Marunouchi 3-chome 17-24, Naka-ku, Nagoya, Aichi
TEL: 052-971-0086



Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread kant kodali
When I did this:

cat /proc/sys/kernel/pid_max

I got 32768.
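pid_max matters here because on Linux every thread consumes a PID; a related system-wide ceiling is threads-max. Both are readable as shown below (and, as root, tunable via sysctl, shown commented out as an illustration only):

```shell
cat /proc/sys/kernel/pid_max      # highest PID the kernel will assign
cat /proc/sys/kernel/threads-max  # system-wide cap on total threads

# To raise pid_max (root only, illustrative value):
#   sysctl -w kernel.pid_max=4194304
```

If the driver really is creating tens of thousands of threads, raising these ceilings only delays the failure; capping the thread creation is the real fix.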

On Sun, Oct 30, 2016 at 6:36 PM, kant kodali  wrote:

> I believe for ubuntu it is unlimited but I am not 100% sure (I just read
> somewhere online). I ran ulimit -a and this is what I get
>
> core file size  (blocks, -c) 0
> data seg size   (kbytes, -d) unlimited
> scheduling priority (-e) 0
> file size   (blocks, -f) unlimited
> pending signals (-i) 120242
> max locked memory   (kbytes, -l) 64
> max memory size (kbytes, -m) unlimited
> open files  (-n) 1024
> pipe size(512 bytes, -p) 8
> POSIX message queues (bytes, -q) 819200
> real-time priority  (-r) 0
> stack size  (kbytes, -s) 8192
> cpu time   (seconds, -t) unlimited
> max user processes  (-u) 120242
> virtual memory  (kbytes, -v) unlimited
> file locks  (-x) unlimited
>
> On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang 
> wrote:
>
>> not sure for ubuntu, but i think you can just create the file by yourself
>> the syntax will be the same as /etc/security/limits.conf
>>
>> nproc.conf not only limit java process but all process by the same user
>> so even the jvm process does nothing,  if the corresponding user is busy
>> in other way
>> the jvm process will still not able to create new thread.
>>
>> btw the default limit for centos is 1024
>>
>>
>> On 10/31/16 9:51 AM, kant kodali wrote:
>>
>>
>> On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang 
>> wrote:
>>
>>> /etc/security/limits.d/90-nproc.conf
>>>
>>
>> Hi,
>>
>> I am using Ubuntu 16.04 LTS. I have this directory
>> /etc/security/limits.d/ but I don't have any files underneath it. This
>> error happens after running for 4 to 5 hours. I wonder if this is a GC
>> issue? And I am thinking if I should use CMS. I have also posted this on SO
>> since I havent got much response for this question http://stackoverflow.
>> com/questions/40315589/dag-scheduler-event-loop-java-lang-
>> outofmemoryerror-unable-to-create-new-native
>>
>>
>> Thanks,
>> kant
>>
>>
>> --
>> INDETAIL Inc. (株式会社INDETAIL)
>> Nearshore General Services Division, Game Services Department
>> Chan Chor Pang (陳 楚鵬)
>> E-mail: chin...@indetail.co.jp
>> URL: http://www.indetail.co.jp
>>
>> [Sapporo Head Office/LABO/LABO2]
>> 〒060-0042 札幌市中央区大通西9丁目3番地33 キタコーセンタービルディング
>> (Head Office/LABO2: 2F, LABO: 9F)
>> TEL: 011-206-9235  FAX: 011-206-9236
>>
>> [Tokyo Branch]
>> 〒108-0014 東京都港区芝5丁目29番20号 クロスオフィス三田
>> TEL: 03-6809-6502  FAX: 03-6809-6504
>>
>> [Nagoya Satellite]
>> 〒460-0002 愛知県名古屋市中区丸の内3丁目17番24号 NAYUTA BLD
>> TEL: 052-971-0086
>>
>


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread kant kodali
I believe for ubuntu it is unlimited but I am not 100% sure (I just read
somewhere online). I ran ulimit -a and this is what I get

core file size  (blocks, -c) 0
data seg size   (kbytes, -d) unlimited
scheduling priority (-e) 0
file size   (blocks, -f) unlimited
pending signals (-i) 120242
max locked memory   (kbytes, -l) 64
max memory size (kbytes, -m) unlimited
open files  (-n) 1024
pipe size(512 bytes, -p) 8
POSIX message queues (bytes, -q) 819200
real-time priority  (-r) 0
stack size  (kbytes, -s) 8192
cpu time   (seconds, -t) unlimited
max user processes  (-u) 120242
virtual memory  (kbytes, -v) unlimited
file locks  (-x) unlimited

On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang 
wrote:

> Not sure about Ubuntu, but I think you can just create the file yourself;
> the syntax is the same as /etc/security/limits.conf.
>
> nproc.conf limits not only the Java process but all processes owned by
> the same user, so even if the JVM process itself does nothing, if the
> corresponding user is busy in other ways
> the JVM will still be unable to create new threads.
>
> By the way, the default limit on CentOS is 1024.
>
>
> On 10/31/16 9:51 AM, kant kodali wrote:
>
>
> On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang 
> wrote:
>
>> /etc/security/limits.d/90-nproc.conf
>>
>
> Hi,
>
> I am using Ubuntu 16.04 LTS. I have the directory /etc/security/limits.d/
> but there are no files underneath it. This error happens after running
> for 4 to 5 hours. I wonder if this is a GC issue, and whether I should
> use CMS. I have also posted this on SO, since I haven't got much response
> to this question:
> http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native
>
>
> Thanks,
> kant
>
>


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread Chan Chor Pang

Not sure about Ubuntu, but I think you can just create the file yourself;
the syntax is the same as /etc/security/limits.conf.

nproc.conf limits not only the Java process but all processes owned by the
same user, so even if the JVM process itself does nothing, if the
corresponding user is busy in other ways

the JVM will still be unable to create new threads.

By the way, the default limit on CentOS is 1024.
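As a sketch of what such a file could look like (the "spark" user name and the values are illustrative assumptions, not recommendations), written to a temp file here since the real target, /etc/security/limits.d/90-nproc.conf, requires root:

```shell
# Sketch of an nproc override using the limits.conf syntax:
#   <domain>  <type>  <item>  <value>
conf=$(mktemp)
cat > "$conf" <<'EOF'
spark    soft    nproc    65536
spark    hard    nproc    65536
EOF
cat "$conf"
```

The soft limit is what `ulimit -u` reports for the user's sessions; the hard limit caps how far the user can raise it.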

On 10/31/16 9:51 AM, kant kodali wrote:


On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang 
> wrote:


/etc/security/limits.d/90-nproc.conf


Hi,

I am using Ubuntu 16.04 LTS. I have the directory
/etc/security/limits.d/ but there are no files underneath it. This
error happens after running for 4 to 5 hours. I wonder if this is a GC
issue, and whether I should use CMS. I have also posted this on SO,
since I haven't got much response to this question:
http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native



Thanks,
kant





Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread kant kodali
On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang 
wrote:

> /etc/security/limits.d/90-nproc.conf
>

Hi,

I am using Ubuntu 16.04 LTS. I have the directory /etc/security/limits.d/
but there are no files underneath it. This error happens after running
for 4 to 5 hours. I wonder if this is a GC issue, and whether I should
use CMS. I have also posted this on SO, since I haven't got much response
to this question:
http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native


Thanks,
kant


Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-30 Thread Chan Chor Pang
You may want to check the process limit of the user who is responsible for
starting the JVM:

/etc/security/limits.d/90-nproc.conf


On 10/29/16 4:47 AM, kant kodali wrote:
 "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to 
create new native thread

at java.lang.Thread.start0(Native Method)
at java.lang.Thread.start(Thread.java:714)
at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(ForkJoinPool.java:1072)
at scala.concurrent.forkjoin.ForkJoinTask.fork(ForkJoinTask.java:654)
at scala.collection.parallel.ForkJoinTasks$WrappedTask$

This is the error produced by the Spark Driver program, which is
running in client mode by default. Some people say to just increase the
heap size by passing the --driver-memory 3g flag; however, the message
"unable to create new native thread" really says that the JVM is asking
the OS to create a new thread but the OS couldn't allocate it anymore.
The number of threads a JVM can create by requesting the OS is platform
dependent, but it is typically around 32K threads on a 64-bit JVM. So I
am wondering why Spark is even creating so many threads, and how do I
control this number?
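A way to watch the driver's thread count directly (the PID below is this shell's own, just to make the sketch runnable; substitute the actual driver JVM's PID) could be:

```shell
# Two equivalent ways to count a process's threads on Linux:
pid=$$                                 # placeholder; use the driver JVM's PID
grep '^Threads' /proc/"$pid"/status    # kernel's thread count for the process
ps -o nlwp= -p "$pid"                  # nlwp = number of lightweight processes
```

Sampling this every few minutes would show whether the thread count grows steadily over the 4-5 hours before the crash.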




Re: java.lang.OutOfMemoryError: unable to create new native thread

2016-10-29 Thread kant kodali
Another thing I forgot to mention: it happens after running for several
hours, say 4 to 5 hours. I am not sure why it is creating so many
threads. Is there any way to control them?

On Fri, Oct 28, 2016 at 12:47 PM, kant kodali  wrote:

>  "dag-scheduler-event-loop" java.lang.OutOfMemoryError: unable to create
> new native thread
> at java.lang.Thread.start0(Native Method)
> at java.lang.Thread.start(Thread.java:714)
> at scala.concurrent.forkjoin.ForkJoinPool.tryAddWorker(ForkJoinPool.java:1672)
> at scala.concurrent.forkjoin.ForkJoinPool.signalWork(ForkJoinPool.java:1966)
> at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.push(ForkJoinPool.java:1072)
> at scala.concurrent.forkjoin.ForkJoinTask.fork(ForkJoinTask.java:654)
> at scala.collection.parallel.ForkJoinTasks$WrappedTask$
>
> This is the error produced by the Spark Driver program, which is running in
> client mode by default. Some people say to just increase the heap size by
> passing the --driver-memory 3g flag; however, the message "unable to
> create new native thread" really says that the JVM is asking the OS to
> create a new thread but the OS couldn't allocate it anymore. The number of
> threads a JVM can create by requesting the OS is platform dependent, but
> it is typically around 32K threads on a 64-bit JVM. So I am wondering why
> Spark is even creating so many threads, and how do I control this number?
>


Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-25 Thread ๏̯͡๏
I have a YARN cluster where the max memory allowed is 16GB. I set 12G for
my driver; however, I see an OutOfMemoryError even for this program:
http://spark.apache.org/docs/1.3.0/sql-programming-guide.html#hive-tables
What do you suggest?

On Wed, Mar 25, 2015 at 8:23 AM, Thomas Gerber thomas.ger...@radius.com
wrote:

 So,

 1. I reduced my  -XX:ThreadStackSize to 5m (instead of 10m - default is
 1m), which is still OK for my need.
 2. I reduced the executor memory to 44GB for a 60GB machine (instead of
 49GB).

 This seems to have helped. Thanks to Matthew and Sean.

 Thomas

 On Tue, Mar 24, 2015 at 3:49 PM, Matt Silvey matt.sil...@videoamp.com
 wrote:

 My memory is hazy on this but aren't there hidden limitations to
 Linux-based threads?  I ran into some issues a couple of years ago where,
 and here is the fuzzy part, the kernel wants to reserve virtual memory per
 thread equal to the stack size.  When the total amount of reserved memory
 (not necessarily resident memory) exceeds the memory of the system it
 throws an OOM.  I'm looking for material to back this up.  Sorry for the
 initial vague response.

 Matthew

 On Tue, Mar 24, 2015 at 12:53 PM, Thomas Gerber thomas.ger...@radius.com
  wrote:

 Additional notes:
 I did not find anything wrong with the number of threads (ps -u USER -L
 | wc -l): around 780 on the master and 400 on executors. I am running on
 100 r3.2xlarge.

 On Tue, Mar 24, 2015 at 12:38 PM, Thomas Gerber 
 thomas.ger...@radius.com wrote:

 Hello,

 I am seeing various crashes in spark on large jobs which all share a
 similar exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and executor
 extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas







-- 
Deepak


Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-25 Thread Matt Silvey
This is a different kind of error.  Thomas' OOM error was specific to the
kernel refusing to create another thread/process for his application.

Matthew

On Wed, Mar 25, 2015 at 10:51 AM, ÐΞ€ρ@Ҝ (๏̯͡๏) deepuj...@gmail.com wrote:

 I have a YARN cluster where the max memory allowed is 16GB. I set 12G for
 my driver; however, I see an OutOfMemoryError even for this program:
 http://spark.apache.org/docs/1.3.0/sql-programming-guide.html#hive-tables
 What do you suggest?

 On Wed, Mar 25, 2015 at 8:23 AM, Thomas Gerber thomas.ger...@radius.com
 wrote:

 So,

 1. I reduced my  -XX:ThreadStackSize to 5m (instead of 10m - default is
 1m), which is still OK for my need.
 2. I reduced the executor memory to 44GB for a 60GB machine (instead of
 49GB).

 This seems to have helped. Thanks to Matthew and Sean.

 Thomas

 On Tue, Mar 24, 2015 at 3:49 PM, Matt Silvey matt.sil...@videoamp.com
 wrote:

 My memory is hazy on this but aren't there hidden limitations to
 Linux-based threads?  I ran into some issues a couple of years ago where,
 and here is the fuzzy part, the kernel wants to reserve virtual memory per
 thread equal to the stack size.  When the total amount of reserved memory
 (not necessarily resident memory) exceeds the memory of the system it
 throws an OOM.  I'm looking for material to back this up.  Sorry for the
 initial vague response.

 Matthew

 On Tue, Mar 24, 2015 at 12:53 PM, Thomas Gerber 
 thomas.ger...@radius.com wrote:

 Additional notes:
 I did not find anything wrong with the number of threads (ps -u USER -L
 | wc -l): around 780 on the master and 400 on executors. I am running on
 100 r3.2xlarge.

 On Tue, Mar 24, 2015 at 12:38 PM, Thomas Gerber 
 thomas.ger...@radius.com wrote:

 Hello,

 I am seeing various crashes in spark on large jobs which all share a
 similar exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and
 executor extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas







 --
 Deepak




Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-24 Thread Matt Silvey
My memory is hazy on this but aren't there hidden limitations to
Linux-based threads?  I ran into some issues a couple of years ago where,
and here is the fuzzy part, the kernel wants to reserve virtual memory per
thread equal to the stack size.  When the total amount of reserved memory
(not necessarily resident memory) exceeds the memory of the system it
throws an OOM.  I'm looking for material to back this up.  Sorry for the
initial vague response.

Matthew

On Tue, Mar 24, 2015 at 12:53 PM, Thomas Gerber thomas.ger...@radius.com
wrote:

 Additional notes:
 I did not find anything wrong with the number of threads (ps -u USER -L |
 wc -l): around 780 on the master and 400 on executors. I am running on 100
 r3.2xlarge.

 On Tue, Mar 24, 2015 at 12:38 PM, Thomas Gerber thomas.ger...@radius.com
 wrote:

 Hello,

 I am seeing various crashes in spark on large jobs which all share a
 similar exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and executor
 extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas





Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-24 Thread Thomas Gerber
So,

1. I reduced my -XX:ThreadStackSize to 5m (instead of 10m; the default is
1m), which is still OK for my needs.
2. I reduced the executor memory to 44GB for a 60GB machine (instead of
49GB).

This seems to have helped. Thanks to Matthew and Sean.

Thomas
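For concreteness, a hedged sketch of how those two changes might be passed on the spark-submit command line (the class and jar names are placeholders; spark.executor.extraJavaOptions is the standard Spark property for extra executor JVM flags):

```shell
# Build the spark-submit invocation (echoed rather than executed here):
opts="-XX:ThreadStackSize=5m"
cmd="spark-submit --class com.example.App \
  --executor-memory 44g \
  --conf spark.executor.extraJavaOptions=${opts} \
  app.jar"
echo "$cmd"
```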

On Tue, Mar 24, 2015 at 3:49 PM, Matt Silvey matt.sil...@videoamp.com
wrote:

 My memory is hazy on this but aren't there hidden limitations to
 Linux-based threads?  I ran into some issues a couple of years ago where,
 and here is the fuzzy part, the kernel wants to reserve virtual memory per
 thread equal to the stack size.  When the total amount of reserved memory
 (not necessarily resident memory) exceeds the memory of the system it
 throws an OOM.  I'm looking for material to back this up.  Sorry for the
 initial vague response.

 Matthew

 On Tue, Mar 24, 2015 at 12:53 PM, Thomas Gerber thomas.ger...@radius.com
 wrote:

 Additional notes:
 I did not find anything wrong with the number of threads (ps -u USER -L |
 wc -l): around 780 on the master and 400 on executors. I am running on 100
 r3.2xlarge.

 On Tue, Mar 24, 2015 at 12:38 PM, Thomas Gerber thomas.ger...@radius.com
  wrote:

 Hello,

 I am seeing various crashes in spark on large jobs which all share a
 similar exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and executor
 extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas






Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-24 Thread Sean Owen
I doubt you're hitting the limit on how many threads you can spawn; rather,
as you say, you're running out of memory that the JVM process is allowed to
allocate, since your threads are grabbing stacks 10x bigger than usual. The
thread stacks are about 4GB by themselves.

I suppose you can't avoid upping the stack size so much?

If so, then I think you need to make more, smaller executors instead.
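A back-of-the-envelope check of that figure, using the thread count from Thomas's earlier `ps` output and the stack size from his `-XX:ThreadStackSize=10m` setting:

```shell
# ~400 threads per executor, each reserving a 10 MB stack of virtual memory:
threads=400
stack_mb=10
echo "$(( threads * stack_mb )) MB reserved for thread stacks"  # 4000 MB, ~4 GB
```

That reservation comes on top of the 49GB executor heap, which is how a 60GB machine runs out of allocatable memory.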

On Tue, Mar 24, 2015 at 7:38 PM, Thomas Gerber thomas.ger...@radius.com wrote:
 Hello,

 I am seeing various crashes in spark on large jobs which all share a similar
 exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and executor
 extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: java.lang.OutOfMemoryError: unable to create new native thread

2015-03-24 Thread Thomas Gerber
Additional notes:
I did not find anything wrong with the number of threads (ps -u USER -L |
wc -l): around 780 on the master and 400 on executors. I am running on 100
r3.2xlarge.

On Tue, Mar 24, 2015 at 12:38 PM, Thomas Gerber thomas.ger...@radius.com
wrote:

 Hello,

 I am seeing various crashes in spark on large jobs which all share a
 similar exception:

 java.lang.OutOfMemoryError: unable to create new native thread
 at java.lang.Thread.start0(Native Method)
 at java.lang.Thread.start(Thread.java:714)

 I increased nproc (i.e. ulimit -u) 10 fold, but it doesn't help.

 Does anyone know how to avoid those kinds of errors?

 Noteworthy: I added -XX:ThreadStackSize=10m on both driver and executor
 extra java options, which might have amplified the problem.

 Thanks for your help,
 Thomas