I have used Spark for several years and realize from recent chatter on this 
list that I don’t really understand how it uses memory.

Specifically: are spark.executor.memory and spark.driver.memory allocated from the 
JVM heap? When does Spark take memory from the JVM heap, and when does it allocate 
off-heap?

Since spark.executor.memory and spark.driver.memory are job parameters, I have 
always assumed that the required memory was allocated off the JVM heap. Or am I on 
the wrong track altogether?
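For concreteness, this is roughly how I have been setting these parameters (a sketch; my_job.py and the sizes are placeholders, and I have included the spark.memory.offHeap.* settings only because they look related to my question):

```shell
# Submit a job with explicit driver/executor memory settings.
# spark.driver.memory and spark.executor.memory size the driver and
# executor JVMs; spark.memory.offHeap.* appears to control a separate
# off-heap pool, which is part of what I am asking about.
spark-submit \
  --conf spark.driver.memory=4g \
  --conf spark.executor.memory=8g \
  --conf spark.memory.offHeap.enabled=true \
  --conf spark.memory.offHeap.size=2g \
  my_job.py
```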

Can someone point me to a discussion of this?

thanks
