Re: How to use more executors

2015-03-11 Thread Nan Zhu
At least 1.4, I think.

For now, using YARN or allowing multiple worker instances works just fine.
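
On YARN, for example, the executor count can be requested directly at submit
time; a minimal sketch, where the application class and jar are placeholders:

  # request 6 executors, each with 2 cores and 4g of memory (YARN mode)
  spark-submit --master yarn \
    --num-executors 6 \
    --executor-cores 2 \
    --executor-memory 4g \
    --class com.example.MyApp \
    my-app.jar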

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, March 11, 2015 at 8:42 PM, Du Li wrote:

 Will it be merged into the next release? It's indeed a critical patch!
  
 Du  
  
  
 On Wednesday, January 21, 2015 3:59 PM, Nan Zhu zhunanmcg...@gmail.com wrote:
  
  
 …not sure when it will be reviewed…
  
 but for now you can work around it by allowing multiple worker instances on a 
 single machine
  
 http://spark.apache.org/docs/latest/spark-standalone.html
  
 search for SPARK_WORKER_INSTANCES
  
 Best,  
  
 --  
 Nan Zhu
 http://codingcat.me
  
 On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
  Will SPARK-1706 be included in the next release?
   
  On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:
   Please see SPARK-1706

   On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:
I tried to submit a job with --conf spark.cores.max=6 or 
--total-executor-cores 6 on a standalone cluster. But I don't see more 
than 1 executor on each worker. I am wondering how to use multiple 
executors when submitting jobs.
 
Thanks
larry



Re: How to use more executors

2015-03-11 Thread Du Li
Is it possible to extend this PR further (or create another PR) to allow for 
per-node configuration of workers?

There are many discussions about heterogeneous Spark clusters. Currently, 
configuration on the master overrides that on the workers. Many Spark users 
need machines with different CPU/memory capacities in the same cluster.
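
In the meantime, one way to approximate per-node settings is to start each 
worker by hand with explicit resources instead of relying on a cluster-wide 
spark-env.sh; a rough sketch, where the master URL and resource values are 
placeholders:

  # on a large node
  ./bin/spark-class org.apache.spark.deploy.worker.Worker \
    --cores 16 --memory 64g spark://master-host:7077

  # on a small node
  ./bin/spark-class org.apache.spark.deploy.worker.Worker \
    --cores 4 --memory 8g spark://master-host:7077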
Du 

 On Wednesday, January 21, 2015 3:59 PM, Nan Zhu zhunanmcg...@gmail.com 
wrote:

…not sure when it will be reviewed…

but for now you can work around it by allowing multiple worker instances on a 
single machine

http://spark.apache.org/docs/latest/spark-standalone.html

search for SPARK_WORKER_INSTANCES

Best,

--
Nan Zhu
http://codingcat.me

On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
 Will SPARK-1706 be included in the next release?
On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:

Please see SPARK-1706
On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:

I tried to submit a job with --conf spark.cores.max=6 or 
--total-executor-cores 6 on a standalone cluster. But I don't see more 
than 1 executor on each worker. I am wondering how to use multiple 
executors when submitting jobs.

Thanks
larry


Re: How to use more executors

2015-03-11 Thread Du Li
Will it be merged into the next release? It's indeed a critical patch!
Du 

 On Wednesday, January 21, 2015 3:59 PM, Nan Zhu zhunanmcg...@gmail.com 
wrote:

…not sure when it will be reviewed…

but for now you can work around it by allowing multiple worker instances on a 
single machine

http://spark.apache.org/docs/latest/spark-standalone.html

search for SPARK_WORKER_INSTANCES

Best,

--
Nan Zhu
http://codingcat.me

On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
 Will SPARK-1706 be included in the next release?
On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:

Please see SPARK-1706
On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:

I tried to submit a job with --conf spark.cores.max=6 or 
--total-executor-cores 6 on a standalone cluster. But I don't see more 
than 1 executor on each worker. I am wondering how to use multiple 
executors when submitting jobs.

Thanks
larry


Re: How to use more executors

2015-03-11 Thread Nan Zhu
I think this should go into a separate PR.

Can you create a JIRA for that?

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, March 11, 2015 at 8:50 PM, Du Li wrote:

 Is it possible to extend this PR further (or create another PR) to allow for 
 per-node configuration of workers?

 There are many discussions about heterogeneous Spark clusters. Currently, 
 configuration on the master overrides that on the workers. Many Spark users 
 need machines with different CPU/memory capacities in the same cluster.
  
 Du  
  
  
 On Wednesday, January 21, 2015 3:59 PM, Nan Zhu zhunanmcg...@gmail.com wrote:
  
  
 …not sure when it will be reviewed…
  
 but for now you can work around it by allowing multiple worker instances on a 
 single machine
  
 http://spark.apache.org/docs/latest/spark-standalone.html
  
 search for SPARK_WORKER_INSTANCES
  
 Best,  
  
 --  
 Nan Zhu
 http://codingcat.me
  
 On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:
  Will SPARK-1706 be included in the next release?
   
  On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:
   Please see SPARK-1706

   On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:
I tried to submit a job with --conf spark.cores.max=6 or 
--total-executor-cores 6 on a standalone cluster. But I don't see more 
than 1 executor on each worker. I am wondering how to use multiple 
executors when submitting jobs.

Thanks
larry



Re: How to use more executors

2015-01-21 Thread Nan Zhu
…not sure when it will be reviewed…

but for now you can work around it by allowing multiple worker instances on a 
single machine

http://spark.apache.org/docs/latest/spark-standalone.html

search for SPARK_WORKER_INSTANCES
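
For instance, to run two workers per node, conf/spark-env.sh might look like 
the sketch below (the values are placeholders); with multiple instances it is 
worth setting SPARK_WORKER_CORES and SPARK_WORKER_MEMORY explicitly so the 
workers don't each try to claim the whole machine:

  # conf/spark-env.sh on each node
  export SPARK_WORKER_INSTANCES=2   # two worker daemons per node
  export SPARK_WORKER_CORES=4       # cores each worker may use
  export SPARK_WORKER_MEMORY=8g     # memory each worker may use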

Best,  

--  
Nan Zhu
http://codingcat.me


On Wednesday, January 21, 2015 at 6:50 PM, Larry Liu wrote:

 Will SPARK-1706 be included in the next release?
  
 On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:
  Please see SPARK-1706
   
  On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:
   I tried to submit a job with --conf spark.cores.max=6 or 
   --total-executor-cores 6 on a standalone cluster. But I don't see more 
   than 1 executor on each worker. I am wondering how to use multiple 
   executors when submitting jobs.

   Thanks
   larry



Re: How to use more executors

2015-01-21 Thread Larry Liu
Will SPARK-1706 be included in the next release?

On Wed, Jan 21, 2015 at 2:50 PM, Ted Yu yuzhih...@gmail.com wrote:

 Please see SPARK-1706

 On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:

 I tried to submit a job with --conf spark.cores.max=6 or 
 --total-executor-cores 6 on a standalone cluster. But I don't see more 
 than 1 executor on each worker. I am wondering how to use multiple 
 executors when submitting jobs.

 Thanks
 larry





Re: How to use more executors

2015-01-21 Thread Ted Yu
Please see SPARK-1706

On Wed, Jan 21, 2015 at 2:43 PM, Larry Liu larryli...@gmail.com wrote:

 I tried to submit a job with --conf spark.cores.max=6 or 
 --total-executor-cores 6 on a standalone cluster. But I don't see more 
 than 1 executor on each worker. I am wondering how to use multiple 
 executors when submitting jobs.

 Thanks
 larry



How to use more executors

2015-01-21 Thread Larry Liu
I tried to submit a job with --conf spark.cores.max=6 or 
--total-executor-cores 6 on a standalone cluster. But I don't see more 
than 1 executor on each worker. I am wondering how to use multiple 
executors when submitting jobs.
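
For reference, the two submissions described above look like this (the master 
URL and application jar are placeholders):

  # cap the application at 6 cores across the whole cluster
  spark-submit --master spark://master-host:7077 \
    --conf spark.cores.max=6 my-app.jar

  # equivalent shorthand for the same limit
  spark-submit --master spark://master-host:7077 \
    --total-executor-cores 6 my-app.jar

Before SPARK-1706, a standalone application gets at most one executor per 
worker, so either setting spreads the 6 cores across workers rather than 
adding executors on any single worker.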

Thanks
larry