Hello,

I've set spark.deploy.spreadOut=false in spark-env.sh.

> export SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=4
> -Dspark.deploy.spreadOut=false"


There are 3 workers, each with 4 cores. Spark-shell was started with a
total of 6 cores.
The Spark UI shows that a single executor was used with all 6 cores.
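For reference, the shell was presumably launched along these lines (a sketch; the original message doesn't show the exact invocation, so the `--total-executor-cores` flag and master URL are assumptions):

```shell
# Request 6 cores in total across the standalone cluster (assumed flag)
spark-shell --master spark://<master-host>:7077 --total-executor-cores 6
```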

Is this a bug? This is with Spark 1.4.

[Screenshot of the Spark UI executors page omitted.]

Srikanth
