Hi All,

Can Spark be used as an alternative to GemFire cache? We currently use GemFire to cache dimension data in memory, which is later read by our custom Java ETL tool. Can I do something like the following with Spark?
Can I cache an RDD in memory for a whole day? As far as I know, an RDD is released once the Spark application finishes executing (correct me if I am wrong).

Spark: create an RDD, then call rdd.persist().

Thanks

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/can-I-use-Spark-as-alternative-for-gem-fire-cache-tp25106.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.