Do you mean sharing a single SparkContext across multiple jobs? https://github.com/spark-jobserver/spark-jobserver does the same thing.
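Both Livy and spark-jobserver expose a long-lived SparkContext over REST, so separate clients can run code against the same context. A minimal sketch of driving Livy's session API (assuming a Livy server on the default port 8998; the helper names here are illustrative, not part of Livy):

```python
# Sketch: sharing one SparkContext via Livy's REST API.
# Assumes a Livy server at http://localhost:8998 (Livy's default port).
import json
import urllib.request

LIVY_URL = "http://localhost:8998"  # assumed endpoint, adjust for your cluster


def create_session_payload(kind="pyspark"):
    """Body for POST /sessions -- Livy starts one SparkContext per session."""
    return {"kind": kind}


def statement_payload(code):
    """Body for POST /sessions/{id}/statements -- the code runs inside the
    session's existing SparkContext, so successive statements (possibly from
    different clients) all share that one context."""
    return {"code": code}


def post(path, payload):
    """Send a JSON POST to the Livy server and decode the JSON response."""
    req = urllib.request.Request(
        LIVY_URL + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


# Example usage (requires a running Livy server):
#   session = post("/sessions", create_session_payload())
#   post("/sessions/%d/statements" % session["id"],
#        statement_payload("sc.parallelize(range(10)).sum()"))
```

spark-jobserver takes a similar approach but manages named contexts explicitly, so jobs can be submitted with a `context` parameter to reuse an existing one.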
On Mon, Dec 5, 2016 at 9:33 AM, Mich Talebzadeh <mich.talebza...@gmail.com> wrote:
> Hi,
>
> Has there been any experience using Livy with Spark to share multiple
> Spark contexts?
>
> thanks
>
> Dr Mich Talebzadeh
>
> LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
> http://talebzadehmich.wordpress.com