The Thrift JDBC server is what you are looking for; it runs one long-lived Spark SQL application with a shared SparkContext that multiple users can connect to:
http://spark.apache.org/docs/latest/sql-programming-guide.html#running-the-thrift-jdbc-server
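In short: start one shared server with sbin/start-thriftserver.sh (it accepts the usual spark-submit options, e.g. --master yarn-client) and have each user connect over JDBC, e.g. with bin/beeline -u jdbc:hive2://<server-host>:10000. Every session then shares the single SparkContext inside the server. As a minimal sketch of a programmatic client (assuming the Hive JDBC driver is on the classpath, the server listens on its default port 10000, and "thrift-host"/"someuser" are placeholder values for your deployment):

    import java.sql.DriverManager

    object ThriftServerClient {
      def main(args: Array[String]): Unit = {
        // Register the Hive JDBC driver; the Spark Thrift server speaks
        // the same protocol as HiveServer2.
        Class.forName("org.apache.hive.jdbc.HiveDriver")
        // thrift-host and someuser are placeholders, not real values.
        val conn = DriverManager.getConnection(
          "jdbc:hive2://thrift-host:10000/default", "someuser", "")
        val stmt = conn.createStatement()
        val rs = stmt.executeQuery("SHOW TABLES")
        while (rs.next()) println(rs.getString(1))
        rs.close(); stmt.close(); conn.close()
      }
    }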

On Wed, Oct 22, 2014 at 11:10 AM, Sadhan Sood <sadhan.s...@gmail.com> wrote:

> We want to run multiple instances of the Spark SQL CLI on our YARN
> cluster, with each instance used by a different user. This looks
> suboptimal if each user brings up a separate CLI, given how Spark works
> on YARN: executor processes run on worker nodes (and hence consume
> resources) for the lifetime of the application. So the right way seems
> to be to share a single SparkContext across users and run just one
> Spark SQL application. Is that understanding correct? Is there a way to
> do it currently? It seems to need some kind of Thrift interface hooked
> into the CLI driver.
>
