Hi,

Reading through the latest documentation on memory management, I see that
the parameter spark.memory.offHeap.enabled (false by default) is described as
‘If true, Spark will attempt to use off-heap memory for certain operations’ [1].

Could you please clarify which operations are meant by ‘certain operations’?
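
For reference, this is roughly how I understand the setting would be used (just a sketch; I am assuming spark.memory.offHeap.size also needs to be set to a positive value when off-heap memory is enabled, and the size below is only an example):

import org.apache.spark.sql.SparkSession

// Minimal sketch: enable off-heap memory for a SparkSession.
// Assumption: spark.memory.offHeap.size must be a positive value
// when spark.memory.offHeap.enabled is true.
val spark = SparkSession
  .builder()
  .appName("offheap-example")
  .master("local[*]")
  .config("spark.memory.offHeap.enabled", "true")
  .config("spark.memory.offHeap.size", "1g") // example size only
  .getOrCreate()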

[1] http://spark.apache.org/docs/latest/configuration.html#memory-management

Thanks!

Best,
Ovidiu
