How to distribute Spark computation recipes

2015-04-27 Thread Olivier Girardot
Hi everyone, I know that any RDD is tied to its SparkContext and the associated variables (broadcasts, accumulators), but I'm looking for a way to serialize/deserialize full RDD computations. @rxin Spark SQL is, in a way, already doing this, but the parsers are private[sql]; is there any way to

Re: How to distribute Spark computation recipes

2015-04-27 Thread Reynold Xin
The code itself is the recipe, no? On Mon, Apr 27, 2015 at 2:49 AM, Olivier Girardot o.girar...@lateral-thoughts.com wrote: Hi everyone, I know that any RDD is related to its SparkContext and the associated variables (broadcast, accumulators), but I'm looking for a way to
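[Editorial sketch] The idea being discussed, shipping a computation "recipe" as data rather than as compiled driver code, can be illustrated without Spark at all. The snippet below is a hypothetical, minimal stand-in (not any Spark API): it encodes a pipeline of named transformations as JSON, so the same recipe string could be sent to another process and replayed there. The operation names (`map_mul`, `filter_gt`) and the `run_recipe` helper are invented for illustration only.

```python
import json

# A hypothetical "recipe" format (NOT a Spark API): each step names a
# transformation plus its argument, so the whole pipeline can be shipped
# across the wire as plain data and replayed on another driver.
OPS = {
    "map_mul": lambda data, k: [x * k for x in data],
    "filter_gt": lambda data, t: [x for x in data if x > t],
}

def run_recipe(recipe_json, data):
    """Deserialize a recipe and apply each step, in order, to data."""
    for op, arg in json.loads(recipe_json):
        data = OPS[op](data, arg)
    return data

# Serialize the computation, not the data: multiply by 3, keep values > 10.
recipe = json.dumps([["map_mul", 3], ["filter_gt", 10]])
print(run_recipe(recipe, [1, 2, 3, 4, 5]))  # [12, 15]
```

This also shows the limitation Reynold's reply points at: the receiving side must already know the operations (here, the `OPS` registry), which in Spark's case is the code itself; serializing the recipe does not remove the need to distribute the code that interprets it.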