Hello,

I am currently exploring Spark for use as a caching framework. The modules require access to both a local and a global cache, so I was looking for ways to share RDDs across programs.
I came across this discussion, which says that RDDs currently cannot be shared across Spark programs, although there is some work in progress on this: https://groups.google.com/forum/#!searchin/spark-users/cache/spark-users/hXsivEoZiyQ/WUZhdXZgxWsJ I would like to know when that functionality might become available for us to use.

Thanks,
Shalini
