Hi there, I'm a little confused about *caching* in Spark.

First, is there any way to *customize the name of a cached RDD*? When I look at
the Storage page, the RDD Name column only shows the kind of RDD, which isn't
very convenient. I'd like it to show my own names instead, something like
'rdd 1', 'rdd of map', 'rdd of groupby' and so on; see the sketch below for
what I have in mind.

Second, can someone tell me what exactly '*Fraction Cached*' means under the
hood? My current guess is sketched below.
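
My guess (please correct me if I'm wrong) is that it is simply the share of an
RDD's partitions that are actually held in the cache, i.e. cached partitions
divided by total partitions:

```python
# Hypothetical illustration of how I think "Fraction Cached" is computed.
def fraction_cached(cached_partitions: int, total_partitions: int) -> float:
    """Return the percentage of partitions that are cached."""
    return 100.0 * cached_partitions / total_partitions

# e.g. an RDD with 200 partitions of which only 150 fit in the cache
print(fraction_cached(150, 200))  # -> 75.0, shown as "75%" on the Storage page
```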

Thanks a lot!

-- 
*--------------------------------------*
a spark lover, a quant, a developer and a good man.

http://github.com/litaotao
