The driver has a heuristic mechanism to decide the number of executors at
run-time according to the number of pending tasks. You can enable it through
configuration; refer to the Spark documentation for the details.
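As a minimal sketch, dynamic allocation on YARN is typically enabled with settings like the following (the external shuffle service is required so executors can be removed without losing shuffle data; the min/max values here are illustrative, not recommendations):

```properties
# spark-defaults.conf
spark.dynamicAllocation.enabled        true
spark.shuffle.service.enabled          true
# illustrative bounds on the executor count
spark.dynamicAllocation.minExecutors   1
spark.dynamicAllocation.maxExecutors   20
```

The driver then requests more executors when tasks stay backlogged (see spark.dynamicAllocation.schedulerBacklogTimeout) and releases idle ones (see spark.dynamicAllocation.executorIdleTimeout).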

2015-05-27 15:00 GMT+08:00 canan chen <ccn...@gmail.com>:

> How does the dynamic allocation work? I mean, is it related
> to the parallelism of my RDD, and how does the driver know how many
> executors it needs?
>
> On Wed, May 27, 2015 at 2:49 PM, Saisai Shao <sai.sai.s...@gmail.com>
> wrote:
>
>> It depends on how you use Spark. If you use Spark with Yarn and enable
>> dynamic allocation, the number of executors is not fixed; it will change
>> dynamically according to the load.
>>
>> Thanks
>> Jerry
>>
>> 2015-05-27 14:44 GMT+08:00 canan chen <ccn...@gmail.com>:
>>
>>> It seems the executor number is fixed for standalone mode; not sure
>>> about other modes.
>>>
>>
>>
>
