I got a core dump when using Spark 1.1.0. The environment is shown below.
Software environment:
  OS: CentOS 6.3
  JVM: Java HotSpot(TM) 64-Bit Server VM (build 21.0-b17, mixed mode)

Hardware environment:
  Memory: 64 GB

I run three Spark processes on this machine, with -Xmx set to 28G, 2G, and 14G respectively. The process that dumped core is the one with -Xmx 14G. I found the core dump happened when disk I/O was very high.

The gdb backtrace from the core dump looks like this:

(gdb) bt
#0  0x0000003fc04328a5 in raise () from /lib64/libc.so.6
#1  0x0000003fc0434085 in abort () from /lib64/libc.so.6
#2  0x0000003fc046fa37 in __libc_message () from /lib64/libc.so.6
#3  0x0000003fc0475366 in malloc_printerr () from /lib64/libc.so.6
#4  0x00002ad53773dad0 in Java_java_io_UnixFileSystem_getLength () from /apps/svr/jdk-7/jre/lib/amd64/libjava.so
#5  0x00002ad53c5c9b5c in ?? ()
#6  0x000000049d9908e0 in ?? ()
#7  0x00000004413d7030 in ?? ()
#8  0x00000004498f77e0 in ?? ()
#9  0x00002ad53c3dff18 in ?? ()
#10 0x00000004498f77e0 in ?? ()
#11 0x00002ad53c7c1668 in ?? ()
#12 0x00000004413d7690 in ?? ()
#13 0x00000004413d7678 in ?? ()
#14 0x0000000000000000 in ?? ()

Additionally, I set the JVM parameters of the Spark executor like this:

-XX:SurvivorRatio=8 -XX:MaxPermSize=1g -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/apps/logs/spark/ -XX:+UseCompressedOops -XX:+UseConcMarkSweepGC -XX:+UseParNewGC -XX:SoftRefLRUPolicyMSPerMB=0 -XX:+UseCMSCompactAtFullCollection -XX:CMSFullGCsBeforeCompaction=0 -XX:+CMSParallelRemarkEnabled -XX:+UseCMSInitiatingOccupancyOnly -XX:CMSInitiatingOccupancyFraction=70 -XX:+CMSConcurrentMTEnabled -XX:+DisableExplicitGC -Djava.net.preferIPv4Stack=true

Sincerely hoping for your help. Thanks.

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/How-to-solve-this-core-dump-error-tp18672.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
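For readers unfamiliar with the crashing frame: `Java_java_io_UnixFileSystem_getLength` is the JNI implementation behind `java.io.File.length()` in OpenJDK/Oracle JDK, so the crash is happening while some thread (likely Spark's shuffle or block-manager code stat-ing files on disk) queries a file's size. The frames above it (`malloc_printerr` → `abort`) indicate glibc detected native heap corruption at that point, which is usually a symptom rather than the cause. A minimal sketch of the Java-level call that reaches this native frame (the class and file names here are illustrative, not from Spark's source):

```java
import java.io.File;
import java.io.IOException;

public class FileLengthDemo {
    public static void main(String[] args) throws IOException {
        // File.length() dispatches to the native method
        // Java_java_io_UnixFileSystem_getLength in libjava.so --
        // the frame at the top of the crashing backtrace.
        File f = File.createTempFile("demo", ".tmp");
        System.out.println(f.getName() + " length = " + f.length());
        f.delete();
    }
}
```

Because this call is trivially safe in pure Java, a SIGABRT inside it under heavy I/O typically points at heap corruption introduced earlier by other native code (JNI libraries, the JVM itself, or a glibc/kernel issue on the host), not at `File.length()` itself.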