Re: Sharing Spark Drivers

2015-02-24 Thread Chip Senkbeil
Hi John, this would be a potential application for the Spark Kernel project ( https://github.com/ibm-et/spark-kernel). The Spark Kernel serves as your driver application, allowing you to feed it snippets of Scala code (or load entire jars via magics) to execute against a Spark cluster.

Sharing Spark Drivers

2015-02-24 Thread John Omernik
I have been posting on the Mesos list, as I am looking to see if it's possible or not to share Spark drivers. Obviously, in standalone cluster mode, the Master handles requests, and you can instantiate a new SparkContext against a currently running master. However in Mesos (and perhaps YARN) I

Re: Sharing Spark Drivers

2015-02-24 Thread John Omernik
I am aware of that, but two things are working against me here with spark-kernel. Python is our language, and we are really looking for an enterprise-supported way to approach this. I like the concept; it just doesn't work for us given our constraints. This does raise an interesting point