Hi John,
This would be a potential application for the Spark Kernel project (
https://github.com/ibm-et/spark-kernel). The Spark Kernel serves as your
driver application, allowing you to feed it snippets of Scala code (or load
entire jars via magics) to execute against a Spark cluster.
I have been posting on the Mesos list, as I am looking to see whether
it's possible to share Spark drivers. Obviously, in standalone
cluster mode, the Master handles requests, and you can
instantiate a new SparkContext against a currently running master. However,
in Mesos (and perhaps YARN) I
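As a sketch of the standalone-mode point above — a new driver registering its own SparkContext with an already running Master — something like the following; the master URL and app name here are illustrative assumptions, not taken from the thread:

```scala
import org.apache.spark.{SparkConf, SparkContext}

object SharedMasterSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical master URL; point this at your running standalone Master.
    val conf = new SparkConf()
      .setAppName("driver-sketch")            // illustrative app name
      .setMaster("spark://master-host:7077")  // standalone Master URL (assumption)

    // Each new SparkContext registers as a separate application with the
    // Master, which schedules it alongside any other running applications.
    val sc = new SparkContext(conf)

    // Minimal sanity check that the context can run a job on the cluster.
    val total = sc.parallelize(1 to 100).sum()
    println(total)

    sc.stop()
  }
}
```

Note this gives each driver its own application (and its own executors); it does not share one driver's SparkContext between users, which is the harder problem being raised here for Mesos.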
I am aware of that, but two things are working against me here with
spark-kernel. Python is our language, and we are really looking for a
supported way to approach this for the enterprise. I like the
concept; it just doesn't work for us given our constraints.
This does raise an interesting point