Hi All,

I am trying to run a Spark job on YARN, and I specify --executor-cores as 20.
But when I check the "Nodes of the cluster" page at
http://hostname:8088/cluster/nodes, I see 4 containers being created on each
node in the cluster.
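
The same container counts show up from the command line (a quick check; the
node ID below is a placeholder for an actual node from the list):

yarn node -list
yarn node -status <node-id>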

But I can only see 1 vcore assigned to each container, even though I specify
--executor-cores 20 when submitting the job with spark-submit.
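
For reference, the submit command looks roughly like this (the class, jar,
and memory values are placeholders, not my exact settings):

spark-submit \
  --master yarn \
  --executor-cores 20 \
  --executor-memory <mem> \
  --class <main-class> <application-jar>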

Relevant settings from yarn-site.xml:
<property>
        <name>yarn.scheduler.maximum-allocation-mb</name>
        <value>60000</value>
</property>
<property>
        <name>yarn.scheduler.minimum-allocation-vcores</name>
        <value>1</value>
</property>
<property>
        <name>yarn.scheduler.maximum-allocation-vcores</name>
        <value>40</value>
</property>
<property>
        <name>yarn.nodemanager.resource.memory-mb</name>
        <value>70000</value>
</property>
<property>
        <name>yarn.nodemanager.resource.cpu-vcores</name>
        <value>20</value>
</property>
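
(For completeness, the per-node vcore usage can also be pulled from the
ResourceManager REST API, outside the UI:

curl http://hostname:8088/ws/v1/cluster/nodes)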


With yarn.nodemanager.resource.cpu-vcores set to 20, I would have expected a
single 20-core executor to use all the vcores on a node, so 4 containers with
1 vcore each don't match what I requested. Has anyone faced the same issue?

Regards,
Satyajit.
