persist @ disk-only failing

2014-05-19 Thread Sai Prasanna
Hi all, when I set the persist level to DISK_ONLY, Spark still tries to use memory and caches there. Any reason why? Do I need to override some parameter elsewhere? Thanks!
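For context, a minimal sketch of the usage being described (the app setup and data are illustrative, not from the thread):

    import org.apache.spark.SparkContext
    import org.apache.spark.storage.StorageLevel

    object DiskOnlyExample {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local", "DiskOnlyExample")

        // Persist with DISK_ONLY: partitions should be written to local disk,
        // not held on the JVM heap, when the RDD is first materialized.
        val rdd = sc.parallelize(1 to 1000000).persist(StorageLevel.DISK_ONLY)

        rdd.count()  // triggers computation and persistence

        // Prints the storage level the RDD was registered with;
        // the issue in this thread is that older versions cached in
        // memory anyway despite this setting.
        println(rdd.getStorageLevel)

        sc.stop()
      }
    }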

Re: persist @ disk-only failing

2014-05-19 Thread Matei Zaharia
This is the patch for it: https://github.com/apache/spark/pull/50/. It might be possible to backport it to 0.8.

Matei

On May 19, 2014, at 2:04 AM, Sai Prasanna ansaiprasa...@gmail.com wrote:

> Matei, I am using 0.8.1! But is there a way, without moving to 0.9.1, to bypass the cache?
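As an aside, one way to sidestep the cache entirely on an affected version (not suggested in the thread; the path and data here are illustrative) is to materialize intermediate results to the filesystem explicitly rather than relying on persist:

    import org.apache.spark.SparkContext

    object BypassCacheSketch {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext("local", "BypassCacheSketch")

        // Compute once and write straight to the filesystem,
        // bypassing the block manager's caching path entirely.
        val computed = sc.parallelize(1 to 1000).map(_ * 2)
        computed.saveAsObjectFile("/tmp/bypass-cache-demo")

        // Downstream jobs read the materialized copy back from disk
        // instead of recomputing or consulting the cache.
        val reloaded = sc.objectFile[Int]("/tmp/bypass-cache-demo")
        println(reloaded.count())

        sc.stop()
      }
    }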