Hello,
I would like to share an RDD between an application and SparkR.
I understand that spark-jobserver and the IBM Spark Kernel can share a
SparkContext across different applications, but I am not sure how to use
either of them with SparkR, since SparkR is essentially a front end (an R
shell) to Spark.
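
For context, here is roughly how a SparkR session gets its context today
(a minimal sketch against the Spark 1.x API; the master and appName values
are just placeholders):

    # Minimal SparkR startup (Spark 1.x API). SparkR launches its own
    # backend JVM and creates a fresh SparkContext here, so there is no
    # obvious hook for attaching to a context owned by another application.
    library(SparkR)
    sc <- sparkR.init(master = "local[*]", appName = "sparkr-shell")
    sqlContext <- sparkRSQL.init(sc)
    # An RDD created in this shell lives only in this context
    # (the RDD API is private in 1.4, hence the ::: access):
    rdd <- SparkR:::parallelize(sc, 1:10)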
Any insights appreciated.

Hari
