If it's running the Thrift server from Hive, it's got a SQL API for you to
connect to...
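
For example, connecting over JDBC with the plain Hive driver looks roughly
like this. This is an untested sketch: the Spark Thrift Server speaks the
HiveServer2 protocol (port 10000 by default), and the host name, user, and
query here are placeholders for your environment:

    import java.sql.DriverManager

    // Requires org.apache.hive:hive-jdbc on the classpath.
    // "cluster-host" and "myuser" are placeholders.
    val conn = DriverManager.getConnection(
      "jdbc:hive2://cluster-host:10000/default", "myuser", "")
    val stmt = conn.createStatement()
    val rs   = stmt.executeQuery("SHOW TABLES")
    while (rs.next()) println(rs.getString(1))
    rs.close(); stmt.close(); conn.close()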

On 3 Sep 2015, at 17:03, Dhaval Patel <dhaval1...@gmail.com> wrote:

I am accessing a Spark environment running in shared cluster mode. However,
there is an existing application (SparkSQL/Thrift Server), running under a
different user, that occupies all available cores. Please see the attached
screenshot for an idea of the current resource utilization.

Is there a way I can use this application to submit my jobs (under a
different user than mapr) without restarting the app with a reduced number
of cores (spark.deploy.defaultCores)?
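
For reference, my understanding is that in standalone mode
spark.deploy.defaultCores is only the cluster-wide default applied to
applications that don't set spark.cores.max themselves, so an individual
application can cap itself at submit time. A minimal sketch, with a
made-up app name and core count:

    import org.apache.spark.{SparkConf, SparkContext}

    // spark.cores.max caps the total cores this one application may take
    // in standalone mode; "8" and the app name are placeholder values.
    val conf = new SparkConf()
      .setAppName("my-job")
      .set("spark.cores.max", "8")
    val sc = new SparkContext(conf)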

What would be an optimal solution for this type of resource-sharing issue?
The Fair Scheduler, or some other approach? -
http://spark.apache.org/docs/latest/job-scheduling.html#scheduling-within-an-application
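
Note that the link above covers scheduling *within* one application, i.e.
it arbitrates between concurrent jobs inside a single SparkContext, not
between separate applications like the Thrift Server and my jobs. A minimal
sketch of enabling it, with a hypothetical pool name:

    import org.apache.spark.{SparkConf, SparkContext}

    // FAIR mode shares cores among concurrent jobs submitted within this
    // one application; the pool name "shared" is made up.
    val conf = new SparkConf()
      .setAppName("fair-app")
      .set("spark.scheduler.mode", "FAIR")
    val sc = new SparkContext(conf)
    sc.setLocalProperty("spark.scheduler.pool", "shared")

For sharing between separate applications, the cross-application knobs
(spark.cores.max, dynamic allocation) on the same docs page appear to be
the relevant ones.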


<image.png>
