How many executors can I acquire in standalone mode?

2015-05-26 Thread canan chen
In Spark standalone mode, there will be one executor per worker. I am
wondering how many executors I can acquire when I submit an app. Is it
greedy (does it acquire as many as it can)?


Re: How many executors can I acquire in standalone mode?

2015-05-26 Thread Arush Kharbanda
I believe you would be restricted by the number of cores you have in your
cluster. Having a worker running without a core is useless.
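
For example (a minimal sketch, assuming Spark 1.x against a standalone
master; the master URL and app name are just placeholders), you can cap the
total cores your application claims, which in turn bounds its executors:

    import org.apache.spark.{SparkConf, SparkContext}

    // By default a standalone app claims every free core in the cluster (greedy),
    // so spark.cores.max is the knob that keeps it from taking them all.
    val conf = new SparkConf()
      .setMaster("spark://master:7077")   // placeholder standalone master URL
      .setAppName("core-capped-app")      // placeholder app name
      .set("spark.cores.max", "4")        // at most 4 cores in total for this app
    val sc = new SparkContext(conf)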

-- 

*Arush Kharbanda* || Technical Teamlead

ar...@sigmoidanalytics.com || www.sigmoidanalytics.com


Re: How many executors can I acquire in standalone mode?

2015-05-27 Thread canan chen
Thanks Arush.
My scenario is that in standalone mode, if I have one worker, one executor is
launched when I start spark-shell; but if I have 2 workers, 2 executors are
launched, so I am wondering about the mechanism of executor allocation.
Is it possible to specify how many executors I want in the code?



Re: How many executors can I acquire in standalone mode?

2015-05-27 Thread ayan guha
You can request the number of cores and the amount of memory for each executor.
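
For example (a minimal sketch, assuming Spark 1.x standalone; the values and
app name are just placeholders), pass the flags to spark-shell / spark-submit,
or set the equivalent properties in code:

    import org.apache.spark.{SparkConf, SparkContext}

    // Equivalent spark-shell / spark-submit flags on standalone:
    //   --executor-memory 2g --total-executor-cores 4
    val conf = new SparkConf()
      .setAppName("sized-executors")        // placeholder app name
      .set("spark.executor.memory", "2g")   // memory requested per executor
      .set("spark.cores.max", "4")          // total cores across all executors
    val sc = new SparkContext(conf)

In standalone mode the executor count itself then follows from how many
workers you have and the core cap, rather than being set directly.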