Hi John,

I'm not quite familiar with how SparkSQL thrift servers are started, but in
general you can't share a Mesos driver between two different frameworks in
Spark. Each spark-shell or spark-submit creates a new framework that
independently receives offers and consumes resources from Mesos.

If you want your executors to be long running, you'll want to run in
coarse-grained mode, which also preserves your cache.
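As a rough sketch, something like the following should do it (the master URL
and resource numbers below are placeholders for your cluster, not values from
your setup):

```shell
# Start the SparkSQL thrift server in coarse-grained mode so executors
# (and the RDDs they cache) stay alive across queries.
# spark.mesos.coarse=true switches the Mesos backend to coarse-grained
# (Spark 1.2 defaults to fine-grained on Mesos); spark.cores.max caps how
# many cores this framework holds, so it doesn't starve other frameworks.
./sbin/start-thriftserver.sh \
  --master mesos://zk://mesos-master:2181/mesos \
  --conf spark.mesos.coarse=true \
  --conf spark.cores.max=8
```

Your iPython session would then be its own framework, submitted separately
with whatever mode fits that workload.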

Tim

On Tue, Jan 6, 2015 at 5:40 AM, John Omernik <j...@omernik.com> wrote:

> I have Spark 1.2 running nicely, both as the SparkSQL thrift server
> and in iPython.
>
> My question is this: I am running on Mesos in fine-grained mode. What
> is the appropriate way to manage the two instances? Should I run
> coarse-grained mode for the Spark SQL Thrift Server so that RDDs can
> persist? Should I run both as separate Spark instances in fine-grained
> mode (I'd have to change the port on one of them)? Is there a way to
> have one Spark driver serve both things so I only use resources for
> one driver? How would you run this in a production environment?
>
> Thanks!
>
> John
>
