Hi,
Is it possible to use Spark as a clustered key/value store (say, like
redis-cluster or Hazelcast)? Would it perform competitively on writes,
reads, and other operations?
My main motivation is to use the same RDD from several different
SparkContexts without saving to disk or using spark-jobserver, but I'm
curious whether someone has already tried using Spark as a key/value store.

Thanks,

Hajime