Hi Srikanth,

It does look like a bug. Did you set `spark.executor.cores` in your
application by any chance?
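
For reference, this matters because in standalone mode `spark.executor.cores`
caps how many cores a single executor may take; when it is unset and
`spark.deploy.spreadOut=false`, the master can grant all requested cores to one
executor on one worker. A sketch of the distinction (the flags are standard
spark-shell options; the master URL and core counts are placeholders mirroring
the report below):

```shell
# Without a per-executor cap, spreadOut=false may pack all 6 cores
# into a single executor on one worker:
./bin/spark-shell \
  --master spark://<master-host>:7077 \
  --total-executor-cores 6

# Capping cores per executor forces the 6-core request to be split
# across multiple executors (here, 3 executors of 2 cores each):
./bin/spark-shell \
  --master spark://<master-host>:7077 \
  --total-executor-cores 6 \
  --conf spark.executor.cores=2
```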

-Andrew

2015-07-22 8:05 GMT-07:00 Srikanth <srikanth...@gmail.com>:

> Hello,
>
> I've set spark.deploy.spreadOut=false in spark-env.sh.
>
>> export SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=4
>> -Dspark.deploy.spreadOut=false"
>
>
> There are 3 workers, each with 4 cores. Spark-shell was started with total
> executor cores = 6.
> The Spark UI shows that one executor was used with 6 cores.
>
> Is this a bug? This is with Spark 1.4.
>
> [image: Inline image 1]
>
> Srikanth
>
