Maybe too naive to ask, but how do I check that?
Sometimes there are almost 200 map tasks pending to run, but only 31 run at a
time.

On Fri, Nov 6, 2015 at 5:57 PM, Chris Mawata <chris.maw...@gmail.com> wrote:

> Also check that you have more than 31 blocks to process.
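> One quick way to check (a sketch; the input path below is only a placeholder)
> is to run fsck against the job's input directory:
>
>   # Lists every file and its blocks under the input path; the summary at the
>   # bottom reports the total block count.
>   hdfs fsck /user/sandeep/job-input -files -blocks
>
> If the job already shows a couple of hundred pending map tasks, the split
> count is clearly not the limit, and the cap is more likely coming from
> cluster resources than from the input data.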
> On Nov 6, 2015 6:54 AM, "sandeep das" <yarnhad...@gmail.com> wrote:
>
>> Hi Varun,
>>
>> I tried to increase this parameter, but it did not increase the number of
>> parallel tasks; if it is decreased, though, YARN does reduce the number of
>> parallel tasks. I'm a bit puzzled why it does not go above 31 tasks even
>> after the value is increased.
>>
>> Is there any other configuration that controls the maximum number of tasks
>> that can execute in parallel?
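>>
>> In case it helps to narrow this down, I can also check what each NodeManager
>> is actually advertising (the node id below is a placeholder):
>>
>>   yarn node -list               # lists the NodeManagers and their node ids
>>   yarn node -status <node-id>   # reports that node's memory/vcore capacity and usage
>>
>> The ResourceManager web UI (port 8088 by default) shows the same cluster
>> totals for memory, vcores and running containers.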
>>
>> Regards,
>> Sandeep
>>
>> On Tue, Nov 3, 2015 at 7:29 PM, Varun Vasudev <vvasu...@apache.org>
>> wrote:
>>
>>> The number of parallel tasks that run depends on the amount of memory and
>>> vcores available on your machines and on the amount of memory and vcores
>>> required by your mappers and reducers. The memory per node is set via
>>> yarn.nodemanager.resource.memory-mb (the default is 8 GB), and the vcores
>>> per node via yarn.nodemanager.resource.cpu-vcores (the default is 8
>>> vcores).
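>>>
>>> To make the arithmetic concrete (the numbers below are illustrative, not
>>> read from your cluster): with the defaults of 8192 MB and 8 vcores per
>>> node, a 4-node cluster advertises 32768 MB and 32 vcores in total. If every
>>> container (map tasks and the MRAppMaster alike) requests 1024 MB and 1
>>> vcore, the cluster can run 32 containers at once; one of those is the AM,
>>> which leaves 31 concurrent map tasks. To use more of the 24 CPUs per node,
>>> raise the node capacities in yarn-site.xml on every NodeManager (example
>>> values only) and restart the NodeManagers:
>>>
>>>   <property>
>>>     <name>yarn.nodemanager.resource.memory-mb</name>
>>>     <!-- example: 48 GB; size this to the RAM you want YARN to use per node -->
>>>     <value>49152</value>
>>>   </property>
>>>   <property>
>>>     <name>yarn.nodemanager.resource.cpu-vcores</name>
>>>     <!-- example: 20 of the 24 cores, leaving some for the OS and daemons -->
>>>     <value>20</value>
>>>   </property>
>>>
>>> The per-task requests are controlled separately by mapreduce.map.memory.mb,
>>> mapreduce.reduce.memory.mb, mapreduce.map.cpu.vcores and
>>> mapreduce.reduce.cpu.vcores.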
>>>
>>> -Varun
>>>
>>> From: sandeep das <yarnhad...@gmail.com>
>>> Reply-To: <user@hadoop.apache.org>
>>> Date: Monday, November 2, 2015 at 3:56 PM
>>> To: <user@hadoop.apache.org>
>>> Subject: Max Parallel task executors
>>>
>>> Hi Team,
>>>
>>> I have a Cloudera cluster of 4 nodes. Whenever I submit a job, only 31
>>> parallel tasks are executed, even though my machines have more CPU
>>> available; YARN/the AM does not create any more tasks.
>>>
>>> Is there any configuration I can change to start more map/reduce tasks in
>>> parallel?
>>>
>>> Each machine in my cluster has 24 CPUs.
>>>
>>> Regards,
>>> Sandeep
>>>
>>
>>
