> I am running a Spark job with 20 cores, but I did not understand why my
> application gets 1-2 cores on a couple of machines. Why does it not just run
> on two nodes, like node1 = 16 cores and node2 = 4 cores? Instead, cores are
> allocated like node1 = 2, node2 = 1, ..., node14 = 1.
I believe that's the intended behaviour.
Hello,
Below is my understanding.
The following default configuration parameters are used by the Spark job
if they are not explicitly set to the required values when the job is
submitted:
# - SPARK_EXECUTOR_INSTANCES, Number of workers to start (Default: 2)
# - SPARK_EXECUTOR_CORES, Number of cores for the workers (Default: 1)
This may also be subject to your YARN constraints, so it is worth checking
the configuration parameters of your YARN cluster as well.
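As a rough sketch (assuming YARN as the cluster manager, that each node has
at least 10 free cores, and using placeholder class/jar names), you could pin
the executor count and cores per executor at submit time so the 20 cores land
on a couple of nodes instead of being spread 1-2 per node:

# Minimal sketch: 2 executors x 10 cores = 20 cores on at most two nodes.
# com.example.MyApp and my-app.jar are hypothetical placeholders; adjust
# --executor-memory to what your nodes can actually offer.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --num-executors 2 \
  --executor-cores 10 \
  --executor-memory 4G \
  --class com.example.MyApp \
  my-app.jar

The same values can also be set via spark.executor.instances and
spark.executor.cores in the job configuration, if you prefer that over
command-line flags.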
On 7/25/2019 20:23, Amit Sharma wrote:
I have a cluster with 26 nodes, each having 16 cores. I am running a Spark
job with 20 cores, but I did not understand why my application gets 1-2 cores
on a couple of machines. Why does it not just run on two nodes, like
node1 = 16 cores and node2 = 4 cores? Instead, cores are allocated like
node1 = 2, node2 = 1, ..., node14 = 1.