Re: libhdfs SIGSEGV error
Hello Pete!

Sorry, I forgot to answer your gdb-related question. I ran the tests under the debugger. With the 100MB buffer, the process received a SIGSEGV at the jni_SetByteArrayRegion function. Any clue what this function does or why it fails?

Thank you.
Cheers,
Tamas

--- On Thu, 11/6/08, Pete Wyckoff [EMAIL PROTECTED] wrote:

From: Pete Wyckoff [EMAIL PROTECTED]
Subject: Re: libhdfs SIGSEGV error
To: [EMAIL PROTECTED], core-user@hadoop.apache.org
Date: Thursday, November 6, 2008, 7:20 PM

Hi Tamas,

Have you tried using the supplied hdfs_write executable included in the distribution? Also, I didn't understand your comment about using hdfsJniHelper.c - that should be used only by hdfs.c itself. Also, what version of hadoop is this? I haven't seen this problem, at least not in hadoop 0.17. And have you run this under gdb?

-- pete

On 11/6/08 10:30 AM, Tamás Szokol [EMAIL PROTECTED] wrote:

Hello!

I'd like to ask for your help with a libhdfs-related problem. I'm trying to perform HDFS tests from C using the libhdfs API. I created a test program that measures the creation times of 1MB, 10MB, 100MB and 1GB files.
The test runs well for 1MB and 10MB, but as soon as I reach 100MB I receive a SIGSEGV error:

==
#
# An unexpected error has been detected by Java Runtime Environment:
#
# SIGSEGV (0xb) at pc=0x7fbead12a32c, pid=6918, tid=140456938362592
#
# Java VM: Java HotSpot(TM) 64-Bit Server VM (10.0-b22 mixed mode linux-amd64)
# Problematic frame:
# V [libjvm.so+0x36d32c]
#
# If you would like to submit a bug report, please visit:
# http://java.sun.com/webapps/bugreport/crash.jsp
#

--- T H R E A D ---

Current thread (0x00609800): JavaThread "main" [_thread_in_vm, id=6918, stack(0x7fffb5cc2000,0x7fffb5ec2000)]

siginfo: si_signo=SIGSEGV, si_errno=0, si_code=1 (SEGV_MAPERR)

[register, top-of-stack, and instruction-byte dump snipped]

Stack: [0x7fffb5cc2000,0x7fffb5ec2000], sp=0x7fffb5ec1240, free space=2044k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
V [libjvm.so+0x36d32c]

--- P R O C E S S ---

Java Threads:
0x0069f800 JavaThread "Low Memory Detector" daemon [_thread_blocked, id=6925, stack(0x7fbe8b1b5000,0x7fbe9b1b6000)]
0x0069dc00 JavaThread "CompilerThread1" daemon
Re: libhdfs SIGSEGV error
Hi Pete!

Thank you for your response.

"Have you tried using the supplied hdfs_write executable included in the distribution?"

I wrote a similar one to perform my tests. I'm going to try the stock one and will get back to you with the results.

"I didn't understand your comment about using hdfsJniHelper.c - that should be used only by hdfs.c itself."

That is correct. But the libhdfs shared library contains the hdfs.c implementation, so when a program calls libhdfs functions, the JNI layer in hdfsJniHelper.c starts a new JVM and performs the appropriate Java calls into HDFS. Please correct me if I'm wrong.

Since a SIGSEGV can also be caused by an rlimit stack problem, I thought I should supply additional memory parameters to the JVM:

-Xss256M -Xoss256M -XX:ThreadStackSize=262144 -Xms128M -Xmx128M

I added them as extra parameters to JavaVMOption, which is used by the JNI_CreateJavaVM function.

"Also, what version of hadoop is this?"

I'm using hadoop-0.19 from svn. Today I will run the tests on the latest stable version, 0.18.2. If you have seen this SIGSEGV error before, could you describe its cause and the solution/workaround?

Thank you.
Cheers,
Tamas

--- On Thu, 11/6/08, Pete Wyckoff [EMAIL PROTECTED] wrote:

[quoted message snipped - identical to the quote earlier in the thread]
Re: libhdfs SIGSEGV error
Hi Tamas,

I cannot reproduce the problem myself - even writing a 100GB file is fine. I am running on amd64 too, with a 64-bit JVM.

I did have one other question: I just wanted to ensure you are compiling libhdfs with -m64. I.e., did you edit its Makefile and replace -m32 with -m64?

Pete

On 11/7/08 5:44 AM, Tamás Szokol [EMAIL PROTECTED] wrote:

[quoted message and crash dump snipped - identical to the first message in the thread]
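Pete's suggested Makefile edit can be scripted. This demo operates on a scratch copy so it runs anywhere; in a real tree, point MAKEFILE at libhdfs's Makefile instead (likely src/c++/libhdfs/Makefile in a 0.19-era checkout - path is an assumption):

```shell
# Demo of the -m32 -> -m64 edit on a scratch file.
MAKEFILE=$(mktemp)
printf 'CFLAGS = -m32 -fPIC -shared\n' > "$MAKEFILE"  # pretend 32-bit flags

sed -i 's/-m32/-m64/g' "$MAKEFILE"   # flip every occurrence to 64-bit
grep -- '-m64' "$MAKEFILE"           # prints: CFLAGS = -m64 -fPIC -shared
rm -f "$MAKEFILE"
```

After rebuilding, running `file` on the produced libhdfs.so should report a 64-bit ELF; a 32-bit libhdfs loaded next to a 64-bit JVM is a plausible source of exactly this kind of crash.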
Re: libhdfs SIGSEGV error
Hi Tamas,

Have you tried using the supplied hdfs_write executable included in the distribution? Also, I didn't understand your comment about using hdfsJniHelper.c - that should be used only by hdfs.c itself. Also, what version of hadoop is this? I haven't seen this problem, at least not in hadoop 0.17. And have you run this under gdb?

-- pete

On 11/6/08 10:30 AM, Tamás Szokol [EMAIL PROTECTED] wrote:

Hello!

I'd like to ask for your help with a libhdfs-related problem. I'm trying to perform HDFS tests from C using the libhdfs API. I created a test program that measures the creation times of 1MB, 10MB, 100MB and 1GB files. The test runs well for 1MB and 10MB, but as soon as I reach 100MB I receive a SIGSEGV error:

[crash dump snipped - identical to the one at the top of the thread]