Hi guys, I want to do some optimization of my Spark code. I use VisualVM to monitor the executor while running the app. Here's a snapshot: <http://apache-spark-user-list.1001560.n3.nabble.com/file/n5107/executor.png>
From the snapshot I can get memory usage information about the executor, but the executor runs many tasks. Is it possible to get the memory usage of a single task inside the JVM while GC is running in the background? By the way, you can see that every time memory usage reaches about 90%, the JVM runs a GC. I'm a little confused about that: I originally thought that 60% of the memory is kept for Spark's memory cache (I did not cache any RDDs in my application), so only 40% would be left for running the app.
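For reference, here is a minimal sketch of how my driver could be set up, assuming the 60% I mention comes from the legacy spark.storage.memoryFraction setting (default 0.6); the app name and executor memory value below are just examples, not from my actual job:

    import org.apache.spark.{SparkConf, SparkContext}

    // Minimal sketch: executor heap size and the cache fraction in question.
    // spark.storage.memoryFraction defaults to 0.6, i.e. 60% of the executor
    // heap is reserved for cached RDDs even if nothing is cached explicitly.
    val conf = new SparkConf()
      .setAppName("memory-usage-test")              // example name
      .set("spark.executor.memory", "2g")           // example value
      .set("spark.storage.memoryFraction", "0.6")   // the 60% referred to above

    val sc = new SparkContext(conf)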