Hi,

Which version of Spark are you using?
Recent versions cannot handle this kind of spilling; see:
http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
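
In case it helps, here is a minimal sketch of the knobs that section of the
tuning guide describes, assuming Spark 1.6+ with the unified memory manager.
The config keys are from the linked guide; the app name and fraction values
are illustrative, not recommendations:

  import org.apache.spark.{SparkConf, SparkContext}

  // Unified memory management (Spark 1.6+): execution and storage share
  // one region sized by spark.memory.fraction of (heap - 300MB reserved).
  val conf = new SparkConf()
    .setAppName("memory-tuning-example") // hypothetical app name
    // Fraction of usable heap given to execution + storage combined.
    .set("spark.memory.fraction", "0.6")
    // Share of that region shielded from eviction for cached blocks;
    // execution borrows the rest and spills to disk only when it runs out.
    .set("spark.memory.storageFraction", "0.5")
  val sc = new SparkContext(conf)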

// maropu

On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar <ashok34...@yahoo.com.invalid>
wrote:

> Hi,
>
> How can one avoid Spark spilling to disk after it fills the node's memory?
>
> Thanks
>


-- 
---
Takeshi Yamamuro
