Re:Re: Spark 2.0 on YARN - Dynamic Resource Allocation Behavior change?

2016-07-28 Thread LONG WANG
Thanks for your reply. I tried your suggestion and it works now.

At 2016-07-28 16:18:01, "Sun Rui"  wrote:
Yes, this is a change in Spark 2.0. You can take a look at
https://issues.apache.org/jira/browse/SPARK-13723


In the latest Spark on YARN documentation for Spark 2.0, the description of
--num-executors (spark.executor.instances) has been updated:
| spark.executor.instances | 2 | The number of executors for static allocation.
With spark.dynamicAllocation.enabled, the initial set of executors will be at
least this large. |
You can disable dynamic allocation for an application by specifying "--conf
spark.dynamicAllocation.enabled=false" on the command line.
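A minimal sketch of such a submission, per the suggestion above (the class name
com.example.MyApp and the JAR name my-app.jar are placeholders, not from the
original thread):

```shell
# Disable dynamic allocation for this submission only; --num-executors
# is then honored as a fixed executor count, as in Spark 1.6.
# (com.example.MyApp and my-app.jar are placeholder names.)
spark-submit \
  --master yarn \
  --conf spark.dynamicAllocation.enabled=false \
  --num-executors 4 \
  --class com.example.MyApp \
  my-app.jar
```

With dynamic allocation left enabled in Spark 2.0, the same --num-executors
value is instead treated as the initial (minimum starting) executor count.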


On Jul 28, 2016, at 15:44, LONG WANG  wrote:


Hi Spark Experts,


  Today I tried Spark 2.0 on YARN with the Dynamic Resource Allocation feature 
enabled. I found that Dynamic Resource Allocation is used whether or not I 
specify --num-executors in the spark-submit command. In Spark 1.6, specifying 
the --num-executors option in spark-submit disabled Dynamic Resource Allocation 
for that job, and I could see the log below in Spark 1.6:


<截图1.png>
   
  Is this a behavior change in Spark 2.0? And how can I temporarily disable 
Dynamic Resource Allocation for a specific job submission, as before?



 




Re: Spark 2.0 on YARN - Dynamic Resource Allocation Behavior change?

2016-07-28 Thread Sun Rui
Yes, this is a change in Spark 2.0. You can take a look at
https://issues.apache.org/jira/browse/SPARK-13723 


In the latest Spark on YARN documentation for Spark 2.0, the description of
--num-executors (spark.executor.instances) has been updated:
> | spark.executor.instances | 2 | The number of executors for static 
> allocation. With spark.dynamicAllocation.enabled, the initial set of 
> executors will be at least this large. |
You can disable dynamic allocation for an application by specifying "--conf
spark.dynamicAllocation.enabled=false" on the command line.

> On Jul 28, 2016, at 15:44, LONG WANG  wrote:
> 
> Hi Spark Experts,
> 
>   Today I tried Spark 2.0 on YARN with the Dynamic Resource Allocation 
> feature enabled. I found that Dynamic Resource Allocation is used whether or 
> not I specify --num-executors in the spark-submit command. In Spark 1.6, 
> specifying the --num-executors option in spark-submit disabled Dynamic 
> Resource Allocation for that job, and I could see the log below in Spark 1.6:
> 
> <截图1.png>
>
>   Is this a behavior change in Spark 2.0? And how can I temporarily disable 
> Dynamic Resource Allocation for a specific job submission, as before?
> 
> 
>  


Spark 2.0 on YARN - Dynamic Resource Allocation Behavior change?

2016-07-28 Thread LONG WANG
Hi Spark Experts,


  Today I tried Spark 2.0 on YARN with the Dynamic Resource Allocation feature 
enabled. I found that Dynamic Resource Allocation is used whether or not I 
specify --num-executors in the spark-submit command. In Spark 1.6, specifying 
the --num-executors option in spark-submit disabled Dynamic Resource Allocation 
for that job, and I could see the log below in Spark 1.6:



   
  Is this a behavior change in Spark 2.0? And how can I temporarily disable 
Dynamic Resource Allocation for a specific job submission, as before?