Re: Spark handling spill overs

2016-05-13 Thread Mich Talebzadeh
Spill-overs are a common issue for in-memory computing systems; after all,
memory is limited. In Spark, where RDDs are immutable, if an RDD is created
with a size greater than half the node's RAM, then a transformation that
generates the consequent RDD' can potentially fill all of the node's memory,
which can cause a spill-over into swap space.
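
As a purely illustrative sketch (not from the original thread; the object
name, input path and map are hypothetical): persisting the derived RDD with
StorageLevel.MEMORY_AND_DISK_SER keeps serialized partitions in memory and
writes the ones that do not fit to the executor's local disk under Spark's
control, rather than pushing the node into OS swap.

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

object SpillSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("spill-sketch"))

    // Hypothetical large input; the path is for illustration only.
    val big = sc.textFile("hdfs:///data/large_input")

    // Keep serialized partitions in memory and write the ones that do not
    // fit to the executor's local disk instead of relying on OS swap.
    val derived = big.map(_.toUpperCase).persist(StorageLevel.MEMORY_AND_DISK_SER)

    println(derived.count())
    sc.stop()
  }
}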

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com



On 13 May 2016 at 00:38, Takeshi Yamamuro wrote:

> Hi,
>
> Which version of Spark do you use?
> The recent one cannot handle this kind of spilling, see:
> http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
>
> // maropu
>
> On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar wrote:
>
>> Hi,
>>
>> How can one avoid having Spark spill over after filling the node's memory?
>>
>> Thanks
>>
>>
>>
>>
>
>
> --
> ---
> Takeshi Yamamuro
>


Re: Spark handling spill overs

2016-05-12 Thread Takeshi Yamamuro
Hi,

Which version of Spark do you use?
The recent one cannot handle this kind of spilling, see:
http://spark.apache.org/docs/latest/tuning.html#memory-management-overview.
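
For reference, the knobs that page describes (assuming Spark 1.6 or later)
are spark.memory.fraction and spark.memory.storageFraction; a minimal sketch
of setting them on a SparkConf, with purely illustrative values:

import org.apache.spark.SparkConf

// The values below are placeholders, not recommendations.
val conf = new SparkConf()
  .setAppName("memory-tuning-sketch")
  // Fraction of (JVM heap - 300MB) shared by execution and storage.
  .set("spark.memory.fraction", "0.6")
  // Portion of that region protected from eviction by execution.
  .set("spark.memory.storageFraction", "0.5")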

// maropu

On Fri, May 13, 2016 at 8:07 AM, Ashok Kumar wrote:

> Hi,
>
> How can one avoid having Spark spill over after filling the node's memory?
>
> Thanks
>
>
>
>


-- 
---
Takeshi Yamamuro


Spark handling spill overs

2016-05-12 Thread Ashok Kumar
Hi,
How can one avoid having Spark spill over after filling the node's memory?
Thanks