Hi, I want to set the number of executors to 16, but strangely the --executor-cores setting seems to affect the executor count on Spark on YARN. I don't understand why this happens, or how to set the executor number reliably.
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
*    --executor-cores 4 \*
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are *7 executors*
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
*    --executor-cores 2 \*
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are *9 executors*
=============================================
./bin/spark-submit --class com.hequn.spark.SparkJoins \
    --master yarn-cluster \
    --num-executors 16 \
    --driver-memory 2g \
    --executor-memory 10g \
*    --executor-cores 1 \*
    /home/sparkjoins-1.0-SNAPSHOT.jar

The UI shows there are *9 executors*
==============================================
The cluster contains 16 nodes, each with 64 GB of RAM.
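
For reference, my understanding is that each executor container YARN grants must fit --executor-memory plus the YARN memory overhead (spark.yarn.executor.memoryOverhead, at least 384 MB by default), and YARN only hands out containers up to the memory and vcores each NodeManager advertises, which can be much less than the physical 64 GB. Below is a rough sketch of the commands I could run to check what YARN actually offers per node (the yarn-site.xml path is a guess for my install, and <node-id> is a placeholder for one of my nodes):

# What each NodeManager advertises (Memory-Capacity / CPU-Capacity):
yarn node -list
yarn node -status <node-id>

# Or check the NodeManager limits in the config (path may differ on your install):
grep -A1 'yarn.nodemanager.resource.memory-mb' /etc/hadoop/conf/yarn-site.xml
grep -A1 'yarn.nodemanager.resource.cpu-vcores' /etc/hadoop/conf/yarn-site.xml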
