[ https://issues.apache.org/jira/browse/SPARK-15822?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15331580#comment-15331580 ]

Pete Robbins commented on SPARK-15822:
--------------------------------------

I can also recreate this issue on Oracle JDK 1.8:

{noformat}
#
# A fatal error has been detected by the Java Runtime Environment:
#
#  SIGSEGV (0xb) at pc=0x00007f0c65d06aec, pid=7521, tid=0x00007f0b69ffd700
#
# JRE version: Java(TM) SE Runtime Environment (8.0_92-b14) (build 1.8.0_92-b14)
# Java VM: Java HotSpot(TM) 64-Bit Server VM (25.92-b14 mixed mode linux-amd64 compressed oops)
# Problematic frame:
# J 7453 C1 org.apache.spark.unsafe.Platform.getByte(Ljava/lang/Object;J)B (9 bytes) @ 0x00007f0c65d06aec [0x00007f0c65d06ae0+0xc]
#
# Failed to write core dump. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
#
# If you would like to submit a bug report, please visit:
#   http://bugreport.java.com/bugreport/crash.jsp
#

---------------  T H R E A D  ---------------

Current thread (0x00007f0bf4008800):  JavaThread "Executor task launch worker-3" daemon [_thread_in_Java, id=7662, stack(0x00007f0b69efd000,0x00007f0b69ffe000)]

siginfo: si_signo: 11 (SIGSEGV), si_code: 1 (SEGV_MAPERR), si_addr: 0x0000000002868e54

Registers:
RAX=0x00007f0c461abb38, RBX=0x00007f0c461abb38, RCX=0x00007f0c213547c8, RDX=0x0000000002868e54
RSP=0x00007f0b69ffba40, RBP=0x00007f0b69ffbae0, RSI=0x0000000000000000, RDI=0x00000001008254d8
R8 =0x00000000200bd0a6, R9 =0x00000000d9fa2650, R10=0x00007f0c79d39020, R11=0x00007f0c65d06ae0
R12=0x0000000000000000, R13=0x00007f0b69ffba88, R14=0x00007f0b69ffbaf8, R15=0x00007f0bf4008800
RIP=0x00007f0c65d06aec, EFLAGS=0x0000000000010202, CSGSFS=0x0000000000000033, ERR=0x0000000000000004
  TRAPNO=0x000000000000000e

Top of Stack: (sp=0x00007f0b69ffba40)
0x00007f0b69ffba40:   00007f0b684b4a70 0000000000000000
0x00007f0b69ffba50:   00007f0b69ffbb10 00007f0c65e96d4c
0x00007f0b69ffba60:   00007f0c65008040 00000000d9fa2628
0x00007f0b69ffba70:   00007f0b69ffbae0 00007f0c650079c0
0x00007f0b69ffba80:   00007f0c650079c0 0000000002868e54
0x00007f0b69ffba90:   0000000000000030 0000000000000000
0x00007f0b69ffbaa0:   00007f0b69ffbaa0 00007f0c21351403
0x00007f0b69ffbab0:   00007f0b69ffbaf8 00007f0c213547c8
0x00007f0b69ffbac0:   0000000000000000 00007f0c21351428
0x00007f0b69ffbad0:   00007f0b69ffba88 00007f0b69ffbaf0
0x00007f0b69ffbae0:   00007f0b69ffbb48 00007f0c650079c0
0x00007f0b69ffbaf0:   0000000000000000 00000000d9f57cf0
0x00007f0b69ffbb00:   000000000000004c 00007f0b69ffbb08
0x00007f0b69ffbb10:   00007f0c21353726 00007f0b69ffbb78
0x00007f0b69ffbb20:   00007f0c213547c8 0000000000000000
0x00007f0b69ffbb30:   00007f0c213537a0 00007f0b69ffbaf0
0x00007f0b69ffbb40:   00007f0b69ffbb70 00007f0b69ffbbc0
0x00007f0b69ffbb50:   00007f0c65007d00 0000000000000000
0x00007f0b69ffbb60:   0000000000000000 0000000000000003
0x00007f0b69ffbb70:   00000000d9f57cf0 00000000d9fa33b0
0x00007f0b69ffbb80:   00007f0b69ffbb80 00007f0c2135385a
0x00007f0b69ffbb90:   00007f0b69ffbbd8 00007f0c213547c8
0x00007f0b69ffbba0:   0000000000000000 00007f0c21353880
0x00007f0b69ffbbb0:   00007f0b69ffbb70 00007f0b69ffbbd0
0x00007f0b69ffbbc0:   00007f0b69ffbc20 00007f0c65007d00
0x00007f0b69ffbbd0:   00000000d9f57cf0 00000000d9fa33b0
0x00007f0b69ffbbe0:   00007f0b69ffbbe0 00007f0b684a24e5
0x00007f0b69ffbbf0:   00007f0b69ffbc88 00007f0b684a2950
0x00007f0b69ffbc00:   0000000000000000 00007f0b684a2618
0x00007f0b69ffbc10:   00007f0b69ffbbd0 00007f0b69ffbc78
0x00007f0b69ffbc20:   00007f0b69ffbcd0 00007f0c65007a90
0x00007f0b69ffbc30:   0000000000000000 0000000000000000 

Instructions: (pc=0x00007f0c65d06aec)
0x00007f0c65d06acc:   0a 80 11 64 01 f8 12 fe 06 90 0c 64 01 f8 12 fe
0x00007f0c65d06adc:   06 90 0c 64 89 84 24 00 c0 fe ff 55 48 83 ec 30
0x00007f0c65d06aec:   0f be 04 16 c1 e0 18 c1 f8 18 48 83 c4 30 5d 85
0x00007f0c65d06afc:   05 ff f5 28 14 c3 90 90 49 8b 87 a8 02 00 00 49 

Register to memory mapping:

RAX={method} {0x00007f0c461abb38} 'getByte' '(Ljava/lang/Object;J)B' in 'org/apache/spark/unsafe/Platform'
RBX={method} {0x00007f0c461abb38} 'getByte' '(Ljava/lang/Object;J)B' in 'org/apache/spark/unsafe/Platform'
RCX=0x00007f0c213547c8 is pointing into metadata
RDX=0x0000000002868e54 is an unknown value
RSP=0x00007f0b69ffba40 is pointing into the stack for thread: 0x00007f0bf4008800
RBP=0x00007f0b69ffbae0 is pointing into the stack for thread: 0x00007f0bf4008800
RSI=0x0000000000000000 is an unknown value
RDI=0x00000001008254d8 is pointing into metadata
R8 =0x00000000200bd0a6 is an unknown value
R9 =0x00000000d9fa2650 is an oop
[B 
 - klass: {type array byte}
 - length: 48
R10=0x00007f0c79d39020: <offset 0xfbc020> in /home/robbins/sdks/jdk1.8.0_92/jre/lib/amd64/server/libjvm.so at 0x00007f0c78d7d000
R11=0x00007f0c65d06ae0 is at entry_point+0 in (nmethod*)0x00007f0c65d06990
R12=0x0000000000000000 is an unknown value
R13=0x00007f0b69ffba88 is pointing into the stack for thread: 0x00007f0bf4008800
R14=0x00007f0b69ffbaf8 is pointing into the stack for thread: 0x00007f0bf4008800
R15=0x00007f0bf4008800 is a thread


Stack: [0x00007f0b69efd000,0x00007f0b69ffe000],  sp=0x00007f0b69ffba40,  free space=1018k
Native frames: (J=compiled Java code, j=interpreted, Vv=VM code, C=native code)
J 7453 C1 org.apache.spark.unsafe.Platform.getByte(Ljava/lang/Object;J)B (9 bytes) @ 0x00007f0c65d06aec [0x00007f0c65d06ae0+0xc]
j  org.apache.spark.unsafe.types.UTF8String.getByte(I)B+11
j  org.apache.spark.unsafe.types.UTF8String.compareTo(Lorg/apache/spark/unsafe/types/UTF8String;)I+30
j  org.apache.spark.unsafe.types.UTF8String.compare(Lorg/apache/spark/unsafe/types/UTF8String;)I+2
j  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.findNextInnerJoinRows$(Lorg/apache/spark/sql/catalyst/expressions/GeneratedClass$GeneratedIterator;Lscala/collection/Iterator;Lscala/collection/Iterator;)Z+141
j  org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext()V+410
J 7795 C1 org.apache.spark.sql.execution.BufferedRowIterator.hasNext()Z (30 bytes) @ 0x00007f0c655268d4 [0x00007f0c65526660+0x274]
j  org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$doExecute$3$$anon$2.hasNext()Z+4
J 5092 C2 scala.collection.Iterator$$anon$11.hasNext()Z (10 bytes) @ 0x00007f0c65c5f904 [0x00007f0c65c5f8a0+0x64]
j  scala.collection.convert.Wrappers$IteratorWrapper.hasNext()Z+4
j  org.spark_project.guava.collect.Ordering.leastOf(Ljava/util/Iterator;I)Ljava/util/List;+132
j  org.apache.spark.util.collection.Utils$.takeOrdered(Lscala/collection/Iterator;ILscala/math/Ordering;)Lscala/collection/Iterator;+29
j  org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(Lscala/collection/Iterator;)Lscala/collection/Iterator;+46
j  org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(Ljava/lang/Object;)Ljava/lang/Object;+5
j  org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(Lorg/apache/spark/TaskContext;ILscala/collection/Iterator;)Lscala/collection/Iterator;+5
j  org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(Ljava/lang/Object;Ljava/lang/Object;Ljava/lang/Object;)Ljava/lang/Object;+13
j  org.apache.spark.rdd.MapPartitionsRDD.compute(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+27
j  org.apache.spark.rdd.RDD.computeOrReadCheckpoint(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+26
j  org.apache.spark.rdd.RDD.iterator(Lorg/apache/spark/Partition;Lorg/apache/spark/TaskContext;)Lscala/collection/Iterator;+33
j  org.apache.spark.scheduler.ResultTask.runTask(Lorg/apache/spark/TaskContext;)Ljava/lang/Object;+136
j  org.apache.spark.scheduler.Task.run(JILorg/apache/spark/metrics/MetricsSystem;)Ljava/lang/Object;+82
j  org.apache.spark.executor.Executor$TaskRunner.run()V+374
j  java.util.concurrent.ThreadPoolExecutor.runWorker(Ljava/util/concurrent/ThreadPoolExecutor$Worker;)V+95
j  java.util.concurrent.ThreadPoolExecutor$Worker.run()V+5
j  java.lang.Thread.run()V+11
v  ~StubRoutines::call_stub
V  [libjvm.so+0x68e746]  JavaCalls::call_helper(JavaValue*, methodHandle*, JavaCallArguments*, Thread*)+0x1056
V  [libjvm.so+0x68ec51]  JavaCalls::call_virtual(JavaValue*, KlassHandle, Symbol*, Symbol*, JavaCallArguments*, Thread*)+0x321
V  [libjvm.so+0x68f0f7]  JavaCalls::call_virtual(JavaValue*, Handle, KlassHandle, Symbol*, Symbol*, Thread*)+0x47
V  [libjvm.so+0x726060]  thread_entry(JavaThread*, Thread*)+0xa0
V  [libjvm.so+0xa6cc5f]  JavaThread::thread_main_inner()+0xdf
V  [libjvm.so+0xa6cd8c]  JavaThread::run()+0x11c
V  [libjvm.so+0x91fad8]  java_start(Thread*)+0x108
C  [libpthread.so.0+0x7aa1]


{noformat}
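
For reference, the faulting frame above (Platform.getByte) is essentially a thin wrapper over sun.misc.Unsafe, so a (base object, offset) pair that points at unmapped memory shows up as a native SIGSEGV in compiled code rather than as a Java exception. A minimal standalone sketch of that failure mode (not Spark code; the class name is made up and the address is just the si_addr from the dump above, reused for illustration):

{noformat}
import java.lang.reflect.Field;
import sun.misc.Unsafe;

public class UnsafeGetByteSketch {
  public static void main(String[] args) throws Exception {
    // Grab the Unsafe singleton reflectively, roughly as Platform does internally.
    Field f = Unsafe.class.getDeclaredField("theUnsafe");
    f.setAccessible(true);
    Unsafe unsafe = (Unsafe) f.get(null);

    // With a null base object the offset is treated as an absolute native
    // address. If that address is unmapped (e.g. a stale or corrupted
    // off-heap pointer), the JVM dies with SIGSEGV instead of throwing.
    long bogusAddress = 0x0000000002868e54L; // si_addr from the dump above
    byte b = unsafe.getByte(null, bogusAddress); // expected to crash the JVM
    System.out.println(b);
  }
}
{noformat}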

> segmentation violation in o.a.s.unsafe.types.UTF8String 
> --------------------------------------------------------
>
>                 Key: SPARK-15822
>                 URL: https://issues.apache.org/jira/browse/SPARK-15822
>             Project: Spark
>          Issue Type: Bug
>    Affects Versions: 2.0.0
>         Environment: linux amd64
> openjdk version "1.8.0_91"
> OpenJDK Runtime Environment (build 1.8.0_91-b14)
> OpenJDK 64-Bit Server VM (build 25.91-b14, mixed mode)
>            Reporter: Pete Robbins
>            Assignee: Herman van Hovell
>            Priority: Blocker
>
> Executors fail with a segmentation violation while running an application with
> spark.memory.offHeap.enabled true
> spark.memory.offHeap.size 512m
> The failure has also now been reproduced with
> spark.memory.offHeap.enabled false
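> A minimal sketch of applying these settings programmatically, in case it helps reproduction (the app name is hypothetical; setting them via spark-defaults.conf or --conf works equally well):
> {noformat}
> import org.apache.spark.sql.SparkSession;
>
> public class OffHeapRepro {
>   public static void main(String[] args) {
>     SparkSession spark = SparkSession.builder()
>         .appName("offheap-repro")                        // hypothetical name
>         .config("spark.memory.offHeap.enabled", "true")  // crash also seen with "false"
>         .config("spark.memory.offHeap.size", "512m")
>         .getOrCreate();
>     // ... run the failing join/takeOrdered workload here ...
>     spark.stop();
>   }
> }
> {noformat}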
> {noformat}
> #
> # A fatal error has been detected by the Java Runtime Environment:
> #
> #  SIGSEGV (0xb) at pc=0x00007f4559b4d4bd, pid=14182, tid=139935319750400
> #
> # JRE version: OpenJDK Runtime Environment (8.0_91-b14) (build 1.8.0_91-b14)
> # Java VM: OpenJDK 64-Bit Server VM (25.91-b14 mixed mode linux-amd64 compressed oops)
> # Problematic frame:
> # J 4816 C2 org.apache.spark.unsafe.types.UTF8String.compareTo(Lorg/apache/spark/unsafe/types/UTF8String;)I (64 bytes) @ 0x00007f4559b4d4bd [0x00007f4559b4d460+0x5d]
> {noformat}
> We initially saw this with IBM Java on a PowerPC box, but it is recreatable on
> Linux with OpenJDK. On Linux with IBM Java 8 we see a NullPointerException at
> the same code point:
> {noformat}
> 16/06/08 11:14:58 ERROR Executor: Exception in task 1.0 in stage 5.0 (TID 48)
> java.lang.NullPointerException
>       at org.apache.spark.unsafe.types.UTF8String.compareTo(UTF8String.java:831)
>       at org.apache.spark.unsafe.types.UTF8String.compare(UTF8String.java:844)
>       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.findNextInnerJoinRows$(Unknown Source)
>       at org.apache.spark.sql.catalyst.expressions.GeneratedClass$GeneratedIterator.processNext(Unknown Source)
>       at org.apache.spark.sql.execution.BufferedRowIterator.hasNext(BufferedRowIterator.java:43)
>       at org.apache.spark.sql.execution.WholeStageCodegenExec$$anonfun$doExecute$2$$anon$2.hasNext(WholeStageCodegenExec.scala:377)
>       at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:408)
>       at scala.collection.convert.Wrappers$IteratorWrapper.hasNext(Wrappers.scala:30)
>       at org.spark_project.guava.collect.Ordering.leastOf(Ordering.java:664)
>       at org.apache.spark.util.collection.Utils$.takeOrdered(Utils.scala:37)
>       at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1365)
>       at org.apache.spark.rdd.RDD$$anonfun$takeOrdered$1$$anonfun$30.apply(RDD.scala:1362)
>       at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:757)
>       at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:757)
>       at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:318)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:282)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
>       at org.apache.spark.scheduler.Task.run(Task.scala:85)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1153)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
>       at java.lang.Thread.run(Thread.java:785)
> {noformat}
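> For context, the comparison both stack traces land in walks the two strings' backing bytes one at a time through getByte, i.e. one unchecked native read per byte, which is presumably why a stale or corrupted base/offset surfaces as a SIGSEGV on HotSpot and as an NPE at the same point on IBM Java. A self-contained sketch of such a byte-by-byte off-heap comparison (illustrative only, not the UTF8String source; class and method names are made up):
> {noformat}
> import java.lang.reflect.Field;
> import sun.misc.Unsafe;
>
> public class OffHeapCompareSketch {
>   static Unsafe unsafe() throws Exception {
>     Field f = Unsafe.class.getDeclaredField("theUnsafe");
>     f.setAccessible(true);
>     return (Unsafe) f.get(null);
>   }
>
>   // Byte-by-byte comparison of two off-heap regions; each u.getByte plays
>   // the role of UTF8String.getByte(i). If either base address were freed
>   // or corrupted, the failure would appear inside one of these reads.
>   static int compare(Unsafe u, long baseA, int lenA, long baseB, int lenB) {
>     int len = Math.min(lenA, lenB);
>     for (int i = 0; i < len; i++) {
>       int res = (u.getByte(baseA + i) & 0xFF) - (u.getByte(baseB + i) & 0xFF);
>       if (res != 0) return res;
>     }
>     return lenA - lenB;
>   }
>
>   public static void main(String[] args) throws Exception {
>     Unsafe u = unsafe();
>     long a = u.allocateMemory(3);
>     long b = u.allocateMemory(3);
>     for (int i = 0; i < 3; i++) {
>       u.putByte(a + i, (byte) ('a' + i)); // "abc"
>       u.putByte(b + i, (byte) ('a' + i));
>     }
>     u.putByte(b + 2, (byte) 'z');         // "abz"
>     System.out.println(compare(u, a, 3, b, 3)); // negative: "abc" < "abz"
>     u.freeMemory(a);
>     u.freeMemory(b);
>   }
> }
> {noformat}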


