[ 
https://issues.apache.org/jira/browse/SPARK-18405?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15696978#comment-15696978
 ] 

Jeff Zhang edited comment on SPARK-18405 at 11/26/16 1:01 AM:
--------------------------------------------------------------

I think he means launching multiple Spark Thrift Server instances in yarn-cluster mode so 
that we can evenly distribute the workload across them (e.g. one Spark Thrift Server 
per department). It is also much easier to request a large container for the AM in 
yarn-cluster mode, whereas in yarn-client mode the memory of the submitting host may be 
limited and is not under YARN's capacity control. 
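As a rough sketch of the client-mode setup the comment describes (one Thrift Server per department), something like the following works today; the queue names, ports, and memory sizes are illustrative assumptions, not values from this ticket:

```shell
# Sketch: one Spark Thrift Server instance per department, yarn-client mode
# (the only mode supported today; yarn-cluster support is what this ticket asks for).
# Queue names, ports, and --driver-memory values below are illustrative.

# Department A: its own YARN queue and its own Thrift port
./sbin/start-thriftserver.sh \
  --master yarn \
  --conf spark.yarn.queue=dept_a \
  --driver-memory 8g \
  --hiveconf hive.server2.thrift.port=10001

# Department B: a separate instance on a different queue and port
./sbin/start-thriftserver.sh \
  --master yarn \
  --conf spark.yarn.queue=dept_b \
  --driver-memory 8g \
  --hiveconf hive.server2.thrift.port=10002
```

Note that in client mode every one of these drivers runs on the submitting host, so that host's memory becomes the bottleneck and is outside YARN's capacity control, which is exactly the limitation yarn-cluster mode would remove.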



> Add yarn-cluster mode support to Spark Thrift Server
> ----------------------------------------------------
>
>                 Key: SPARK-18405
>                 URL: https://issues.apache.org/jira/browse/SPARK-18405
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 1.6.0, 1.6.2, 2.0.0, 2.0.1
>            Reporter: Prabhu Kasinathan
>              Labels: Spark, ThriftServer2
>
> Currently, the Spark Thrift Server can run only in yarn-client mode. 
> Can we add yarn-cluster mode support to the Spark Thrift Server?
> This would let us launch multiple Spark Thrift Server instances with different Spark 
> configurations, which really helps in large distributed clusters where there is a 
> requirement to run complex SQL through STS. With client mode, there is a 
> chance of overloading the local host with too much driver memory. 
> Please let me know your thoughts.



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
