Re: Resource allocation issue - is it possible to submit a new job in existing application under a different user?

2015-09-03 Thread Steve Loughran
If it's running the Thrift Server from Hive, it's got a SQL API for you to
connect to...

On 3 Sep 2015, at 17:03, Dhaval Patel wrote:

I am accessing a shared Spark environment in cluster mode. However, there is an
existing application (SparkSQL/Thrift Server), running under a different user,
that occupies all available cores. Please see the attached screenshot for an
idea of the current resource utilization.

Is there a way I can use this application to submit my jobs (under a different
user than mapr) without restarting this app with a reduced number of cores
(spark.deploy.defaultCores)?

What would be an optimal solution for this type of resource-sharing issue? The
Fair Scheduler, or another approach? -
http://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application






Resource allocation issue - is it possible to submit a new job in existing application under a different user?

2015-09-03 Thread Dhaval Patel
I am accessing a shared Spark environment in cluster mode. However, there is
an existing application (SparkSQL/Thrift Server), running under a different
user, that occupies all available cores. Please see the attached screenshot to
get an idea of the current resource utilization.

Is there a way I can use this application to submit my jobs (under a
different user than mapr) without restarting this app with a reduced number
of cores (spark.deploy.defaultCores)?

What would be an optimal solution for this type of resource-sharing issue?
The Fair Scheduler, or another approach? -
http://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
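[Editor's note: a minimal sketch of the two options the linked page implies, for the standalone mode shown here. Both require (re)starting the Thrift Server once; the pool name, file paths, and core count below are illustrative assumptions, not taken from this thread.]

```shell
# Option 1: cap the application's cores so other apps can get executors.
# Standalone mode honors spark.cores.max per application.
./sbin/start-thriftserver.sh --conf spark.cores.max=8

# Option 2: enable the Fair Scheduler *within* the one application, so jobs
# from different sessions share that application's resources fairly.
# Pool name "shared" and the file path are illustrative.
cat > /tmp/fairscheduler.xml <<'EOF'
<?xml version="1.0"?>
<allocations>
  <pool name="shared">
    <schedulingMode>FAIR</schedulingMode>
    <weight>1</weight>
    <minShare>2</minShare>
  </pool>
</allocations>
EOF

./sbin/start-thriftserver.sh \
  --conf spark.scheduler.mode=FAIR \
  --conf spark.scheduler.allocation.file=/tmp/fairscheduler.xml
```

Option 2 matches the "scheduling within an application" section of the linked page: it does not free cores for *other* applications, but it lets multiple users' jobs share the one application that holds them.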


[image: Inline image 1]
[image: Inline image 3]


Re: Resource allocation issue - is it possible to submit a new job in existing application under a different user?

2015-09-03 Thread Dhaval Patel
Yes, you're right, and I can connect to it through tools like Tableau, but I
don't know how to connect from a shell so that I can submit more jobs to this
application.

Any insight on how I can connect using a shell?
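[Editor's note: a sketch of one way to do this. The Thrift Server speaks the HiveServer2 JDBC protocol, so the `beeline` client that ships with Spark can connect from a shell; the hostname and user below are placeholders, and 10000 is the default Thrift Server port.]

```shell
# Connect to the shared Thrift Server from a shell via beeline.
# "thrift-host" and "myuser" are placeholders for your environment.
./bin/beeline -u "jdbc:hive2://thrift-host:10000" -n myuser

# SQL typed at the beeline prompt then runs as jobs inside the
# already-running application, under its existing resource allocation:
#   0: jdbc:hive2://thrift-host:10000> SHOW TABLES;
```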

On Thu, Sep 3, 2015 at 1:39 PM, Steve Loughran wrote:

> If it's running the Thrift Server from Hive, it's got a SQL API for you to
> connect to...
>
> On 3 Sep 2015, at 17:03, Dhaval Patel wrote:
>
> I am accessing a shared Spark environment in cluster mode. However, there is
> an existing application (SparkSQL/Thrift Server), running under a different
> user, that occupies all available cores. Please see the attached screenshot to
> get an idea of the current resource utilization.
>
> Is there a way I can use this application to submit my jobs (under a
> different user than mapr) without restarting this app with a reduced number
> of cores (spark.deploy.defaultCores)?
>
> What would be an optimal solution for this type of resource-sharing issue?
> The Fair Scheduler, or another approach? -
> http://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application