RDDs which are no longer required will be removed from memory by Spark
itself (which you can consider lazy?).
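For context, the behavior described above can be sketched as follows. This is illustrative only: it assumes a running Spark application with an in-scope SparkContext `sc` (e.g. in spark-shell); it will not compile standalone without Spark on the classpath. When executors run low on storage memory, Spark evicts cached partitions on its own in LRU order, so an explicit unpersist() is not strictly required for correctness.

```scala
// Sketch only: assumes a live SparkContext `sc` (spark-shell provides one).
import org.apache.spark.storage.StorageLevel

val rdd = sc.parallelize(1 to 1000000)
rdd.persist(StorageLevel.MEMORY_ONLY) // ask Spark to cache the partitions
rdd.count()                           // action: materializes the cached blocks

// If storage memory fills up, Spark drops cached partitions LRU-first by
// itself; a dropped partition is simply recomputed from lineage if needed.
```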
Thanks
Best Regards
On Wed, Jul 1, 2015 at 7:48 PM, Jem Tucker jem.tuc...@gmail.com wrote:
Hi,
The current behavior of rdd.unpersist() appears not to be lazily executed
and
Hi,
After running some tests it appears unpersist is called as soon as it
is reached, so any tasks using this RDD later on will have to recalculate
it. This is fine for simple programs, but when an RDD is created within a
function and its reference is then lost but children of it continue to
You may pass an optional parameter (blocking = false) to make it lazy.
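A sketch of that suggestion, again assuming a live SparkContext `sc` (not standalone-runnable without Spark): `RDD.unpersist` takes a `blocking` parameter, and passing `blocking = false` makes the call return immediately while the cached blocks are removed asynchronously in the background, rather than waiting for the deletion to complete.

```scala
// Sketch only: assumes a live SparkContext `sc`.
val rdd = sc.textFile("data.txt").cache()
rdd.count()                      // action: populates the cache

rdd.unpersist(blocking = false)  // returns immediately; blocks are
                                 // removed asynchronously in the background
// rdd.unpersist(blocking = true) would block until all blocks are deleted
```

Note that `blocking = false` makes the removal non-blocking rather than deferring it until the RDD is no longer referenced; tasks scheduled after the call may still need to recompute the RDD from lineage.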
Thank you,
Ilya Ganelin
-----Original Message-----
From: Jem Tucker [jem.tuc...@gmail.com]
Sent: Thursday, July 02, 2015 04:06 AM Eastern Standard Time
To: Akhil Das
Cc: user
Subject: Re: Making