Looks like there is no such capability yet; spark.rdd.compress applies globally to the SparkContext, not to individual RDDs.
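As a minimal sketch (Scala; the app name is hypothetical), this is how the global flag is set today. It only takes effect for RDDs persisted with a serialized storage level such as MEMORY_ONLY_SER:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.storage.StorageLevel

// spark.rdd.compress is a SparkContext-wide setting: once enabled,
// every RDD persisted with a *_SER storage level is compressed.
val conf = new SparkConf()
  .setAppName("compress-example") // hypothetical app name
  .set("spark.rdd.compress", "true")
val sc = new SparkContext(conf)

val rdd = sc.parallelize(1 to 1000000)
// Compressed because of the global flag; non-serialized levels
// (e.g. MEMORY_ONLY) are unaffected by spark.rdd.compress.
rdd.persist(StorageLevel.MEMORY_ONLY_SER)
```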

How would you specify which RDDs to compress?

Thanks

> On Mar 15, 2016, at 4:03 AM, Nirav Patel <npa...@xactlycorp.com> wrote:
> 
> Hi,
> 
> I see that there's the following Spark config to compress an RDD. My guess
> is that it will compress all RDDs of a given SparkContext, right? If so, is
> there a way to instruct the Spark context to compress only some RDDs and
> leave others uncompressed?
> 
> Thanks
> 
> spark.rdd.compress    false    Whether to compress serialized RDD partitions
> (e.g. for StorageLevel.MEMORY_ONLY_SER). Can save substantial space at the
> cost of some extra CPU time.
> 
