So, Amol, did you look at the heap dump?
Denis

Mon, Aug 6, 2018 at 18:46, Amol Zambare <amolazamb...@gmail.com>:

> Hi Alex,
>
> Here is the full stack trace:
>
> [INFO][tcp-disco-sock-reader-#130][TcpDiscoverySpi] Finished serving remote node connection
> [INFO][tcp-disco-sock-reader-#653][TcpDiscoverySpi] Started serving remote node connection
> [SEVERE][tcp-disco-sock-reader-#130][TcpDiscoverySpi] Runtime error caught during grid runnable execution: Socket reader [id=313, name=tcp-disco-sock-reader-#130, nodeId=35a7ca47-3245-4f9f-8114-9b65c6d5e9bf]
> java.lang.OutOfMemoryError: GC overhead limit exceeded
>         at java.util.Arrays.copyOf(Arrays.java:3332)
>         at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
>         at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:596)
>         at java.lang.StringBuilder.append(StringBuilder.java:190)
>         at java.io.ObjectInputStream$BlockDataInputStream.readUTFSpan(ObjectInputStream.java:3450)
>         at java.io.ObjectInputStream$BlockDataInputStream.readUTFBody(ObjectInputStream.java:3358)
>         at java.io.ObjectInputStream$BlockDataInputStream.readUTF(ObjectInputStream.java:3170)
>         at java.io.ObjectInputStream.readString(ObjectInputStream.java:1850)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1527)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:423)
>         at org.apache.ignite.internal.util.IgniteUtils.readMap(IgniteUtils.java:5146)
>         at org.apache.ignite.spi.discovery.tcp.internal.TcpDiscoveryNode.readExternal(TcpDiscoveryNode.java:617)
>         at java.io.ObjectInputStream.readExternalData(ObjectInputStream.java:2063)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2012)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1536)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2232)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2156)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2014)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1536)
>         at java.io.ObjectInputStream.readObject(ObjectInputStream.java:423)
>         at java.util.ArrayList.readObject(ArrayList.java:791)
>         at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>         at java.lang.reflect.Method.invoke(Method.java:498)
>         at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1058)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2123)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2014)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1536)
>         at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2232)
>         at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2156)
>         at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2014)
>         at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1536)
> [INFO][tcp-disco-srvr-#3][TcpDiscoverySpi] TCP discovery accepted incoming connection
> [INFO][tcp-disco-srvr-#3][TcpDiscoverySpi] TCP discovery accepted incoming connection
> [INFO][tcp-disco-srvr-#3][TcpDiscoverySpi] TCP discovery spawning a new thread for connection
> [INFO][tcp-disco-srvr-#3][TcpDiscoverySpi] TCP discovery accepted incoming connection
> [INFO][tcp-disco-sock-reader-#654][TcpDiscoverySpi] Started serving remote node connection
> [INFO][tcp-disco-srvr-#3][TcpDiscoverySpi] TCP discovery spawning a new thread for connection
>
> Thanks,
> Amol
>
>
> On Sat, Aug 4, 2018 at 3:28 AM, Alex Plehanov <plehanov.a...@gmail.com> wrote:
>
>> Offheap and heap memory regions are used for different purposes and can't replace each other. You can't get rid of an OOME in the heap by increasing offheap memory.
>> Can you provide the full exception stack trace?
>>
>> 2018-08-03 20:55 GMT+03:00 Amol Zambare <amolazamb...@gmail.com>:
>>
>>> Thanks, Alex and Denis.
>>>
>>> We have configured off-heap memory to 100 GB and we have a 10-node Ignite cluster. However, when we run a Spark job we see the following error in the Ignite logs. While the Spark job is running, heap utilization on most of the Ignite nodes increases significantly even though we are using off-heap storage. We have set the JVM heap size on each Ignite node to 50 GB. Please suggest.
>>>
>>> java.lang.OutOfMemoryError: GC overhead limit exceeded
>>>         at java.util.Arrays.copyOf(Arrays.java:3332)
>>>         at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:124)
>>>
>>>
>>> On Fri, Aug 3, 2018 at 4:16 AM, Alex Plehanov <plehanov.a...@gmail.com> wrote:
>>>
>>>> The "Non-heap memory ..." metrics in Visor have nothing to do with the offheap memory allocated for data regions. "Non-heap memory" as reported by Visor is the JVM-managed memory outside the heap, used for internal JVM purposes (JIT compiler, etc.; see [1]). Memory allocated offheap by Ignite for data regions (via "unsafe") is not included in these metrics. Data-region-related metrics were added to Visor in Ignite 2.4.
>>>>
>>>> [1] https://docs.oracle.com/javase/8/docs/api/java/lang/management/MemoryMXBean.html
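[Editor's note: Alex's distinction between the heap, the JVM's "non-heap" pools, and Ignite's Unsafe-allocated data regions can be seen directly from the platform MBeans he links in [1]. The sketch below is plain JDK (no Ignite needed) and simply prints the two figures Visor reports; it is an illustration, not part of the original thread.]

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryMXBean;
import java.lang.management.MemoryUsage;

public class MemoryMetrics {
    public static void main(String[] args) {
        MemoryMXBean mem = ManagementFactory.getMemoryMXBean();

        // The Java object heap -- where "GC overhead limit exceeded" happens.
        MemoryUsage heap = mem.getHeapMemoryUsage();

        // JVM-internal pools (Metaspace, code cache, ...) -- this is what
        // Visor labels "Non-heap memory".
        MemoryUsage nonHeap = mem.getNonHeapMemoryUsage();

        System.out.println("heap used     = " + heap.getUsed());
        System.out.println("non-heap used = " + nonHeap.getUsed());

        // Note: memory Ignite allocates for data regions via sun.misc.Unsafe
        // is raw native memory and appears in NEITHER of these figures.
    }
}
```

Both numbers are always positive on a running JVM, which makes the point concrete: a 100 GB data region would move neither of them, while a deserialization storm on discovery threads (as in the trace above) exhausts only the first.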
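[Editor's note: on Denis's heap-dump question — the usual approach for an OOME is to start the node with -XX:+HeapDumpOnOutOfMemoryError (plus -XX:HeapDumpPath=...), but a dump can also be taken from a live JVM via the HotSpot diagnostic MBean. A minimal sketch, not from the thread; the output path heap.hprof is just an example:]

```java
import com.sun.management.HotSpotDiagnosticMXBean;
import java.lang.management.ManagementFactory;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class HeapDump {
    public static void main(String[] args) throws Exception {
        // Example output path (dumpHeap refuses to overwrite an existing file).
        Path out = Paths.get("heap.hprof");
        Files.deleteIfExists(out);

        HotSpotDiagnosticMXBean diag = ManagementFactory.newPlatformMXBeanProxy(
            ManagementFactory.getPlatformMBeanServer(),
            "com.sun.management:type=HotSpotDiagnostic",
            HotSpotDiagnosticMXBean.class);

        // 'true' = dump only live objects (forces a full GC first),
        // which keeps the file smaller and easier to analyze.
        diag.dumpHeap(out.toString(), true);

        System.out.println("dumped " + Files.size(out) + " bytes to " + out);
    }
}
```

The resulting .hprof file can then be opened in Eclipse MAT or VisualVM to see which objects (here, likely strings retained during TcpDiscoveryNode deserialization) dominate the heap.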