Sorry, I meant the UID, not the PID.

On 10/31/16 11:59 AM, Chan Chor Pang wrote:

Actually, if the max user processes limit is not the problem, I have no idea.

But I still suspect the user:
the user who runs spark-submit is not necessarily the owner of the JVM process.

Can you make sure that when you run "ps -ef | grep {your app id}" the PID is root?
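
For example, something like this (assuming your application id shows up
in the command line, which it normally does for spark-submit) will show
which user the JVM actually runs as:

ps -eo user,pid,cmd | grep {your app id}

The first column is the owner of the process.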

On 10/31/16 11:21 AM, kant kodali wrote:
The Java process is run by root and it has the same config:

sudo -i

ulimit -a

core file size          (blocks, -c) 0
data seg size           (kbytes, -d) unlimited
scheduling priority             (-e) 0
file size               (blocks, -f) unlimited
pending signals                 (-i) 120242
max locked memory       (kbytes, -l) 64
max memory size         (kbytes, -m) unlimited
open files                      (-n) 1024
pipe size            (512 bytes, -p) 8
POSIX message queues     (bytes, -q) 819200
real-time priority              (-r) 0
stack size              (kbytes, -s) 8192
cpu time               (seconds, -t) unlimited
max user processes              (-u) 120242
virtual memory          (kbytes, -v) unlimited
file locks                      (-x) unlimited



On Sun, Oct 30, 2016 at 7:01 PM, Chan Chor Pang <chin...@indetail.co.jp> wrote:

    I had the same exception before, and the problem was fixed after I
    changed the nproc conf.

    > max user processes (-u) 120242
    ↑ this config does look good.
    Are you sure the user who ran ulimit -a is the same user who runs
    the Java process?
    Depending on how you submit the job and your settings, the Spark
    job may be executed by another user.
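
    A quick way to check the limits that actually apply to the running
    JVM (no matter which shell you ran ulimit -a in) is to read them
    from /proc, for example:

        cat /proc/<jvm pid>/limits | grep -E -i 'processes|open files'

    The "Max processes" line there is the nproc limit the JVM was
    started with (<jvm pid> is a placeholder for the PID from ps).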


    On 10/31/16 10:38 AM, kant kodali wrote:
    When I did this:

    cat /proc/sys/kernel/pid_max

    I got 32768.
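
    (Threads consume PIDs too, so kernel.pid_max is effectively a
    system-wide cap on the total number of processes plus threads. If
    it ever needs raising, something along these lines should work on
    Linux, with the value here being just an example:

        sudo sysctl -w kernel.pid_max=131072
        echo 'kernel.pid_max = 131072' | sudo tee -a /etc/sysctl.conf

    but with a single JVM it is usually the per-user nproc limit that
    is hit first.)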


    On Sun, Oct 30, 2016 at 6:36 PM, kant kodali <kanth...@gmail.com> wrote:

        I believe for Ubuntu it is unlimited, but I am not 100% sure
        (I just read that somewhere online). I ran ulimit -a and this
        is what I get:

        core file size          (blocks, -c) 0
        data seg size           (kbytes, -d) unlimited
        scheduling priority             (-e) 0
        file size               (blocks, -f) unlimited
        pending signals                 (-i) 120242
        max locked memory       (kbytes, -l) 64
        max memory size         (kbytes, -m) unlimited
        open files                      (-n) 1024
        pipe size            (512 bytes, -p) 8
        POSIX message queues     (bytes, -q) 819200
        real-time priority              (-r) 0
        stack size              (kbytes, -s) 8192
        cpu time               (seconds, -t) unlimited
        max user processes              (-u) 120242
        virtual memory          (kbytes, -v) unlimited
        file locks                      (-x) unlimited

        On Sun, Oct 30, 2016 at 6:15 PM, Chan Chor Pang <chin...@indetail.co.jp> wrote:

            Not sure for Ubuntu, but I think you can just create the
            file yourself; the syntax is the same as
            /etc/security/limits.conf.

            nproc.conf limits not only the Java process but all
            processes owned by the same user,
            so even if the JVM process is doing nothing, if the
            corresponding user is busy in some other way,
            the JVM process will still not be able to create new threads.

            BTW, the default limit for CentOS is 1024.

            On 10/31/16 9:51 AM, kant kodali wrote:

            On Sun, Oct 30, 2016 at 5:22 PM, Chan Chor Pang <chin...@indetail.co.jp> wrote:

                /etc/security/limits.d/90-nproc.conf


            Hi,

            I am using Ubuntu 16.04 LTS. I have the directory
            /etc/security/limits.d/ but there are no files underneath
            it. This error happens after running for 4 to 5 hours. I
            wonder if this is a GC issue, and I am considering whether
            I should switch to CMS. I have also posted this on SO,
            since I haven't gotten much of a response to this question:

            http://stackoverflow.com/questions/40315589/dag-scheduler-event-loop-java-lang-outofmemoryerror-unable-to-create-new-native
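
            (For reference, one way to see whether threads keep
            accumulating over those 4 to 5 hours is to watch the
            thread count of the driver JVM, for example:

                watch -n 60 'ps -o nlwp= -p <driver pid>'

            where <driver pid> is a placeholder for the driver's PID
            and nlwp is the number of threads in the process.)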


            Thanks,
            kant

