Hello,
I've set spark.deploy.spreadOut=false in spark-env.sh.
export SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=4 -Dspark.deploy.spreadOut=false"
There are 3 workers, each with 4 cores. Spark-shell was started with the
number of cores = 6.
The Spark UI shows that one executor was used, with 6 cores.
Is this expected behavior?
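For context, the setup described above would look something like the following (the master URL is an assumption; the rest comes from the thread):

```shell
# spark-env.sh on the standalone master: with spreadOut=false the master
# consolidates an application's cores onto as few workers as possible
export SPARK_MASTER_OPTS="-Dspark.deploy.defaultCores=4 -Dspark.deploy.spreadOut=false"

# Launch the shell asking for 6 cores total across the cluster
# (spark://master:7077 is a placeholder for the actual master URL)
spark-shell --master spark://master:7077 --total-executor-cores 6
```

With three 4-core workers, 6 requested cores should not all fit on a single worker, which is what makes the single 6-core executor in the UI look wrong.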
Hi Srikanth,
It does look like a bug. Did you set `spark.executor.cores` in your
application by any chance?
-Andrew
2015-07-22 8:05 GMT-07:00 Srikanth srikanth...@gmail.com:
Cool. Thanks!
Srikanth
On Wed, Jul 22, 2015 at 3:12 PM, Andrew Or and...@databricks.com wrote:
Hi Srikanth,
I was able to reproduce the issue by setting `spark.cores.max` to a number
greater than the number of cores on a worker. I've filed SPARK-9260 which I
believe is already being fixed in https://github.com/apache/spark/pull/7274.
Thanks for reporting the issue!
-Andrew
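To see why the reported behavior looks like a bug, here is a simplified sketch of the two core-allocation strategies the standalone master chooses between. This is illustrative only, not the actual Spark master source; the object and method names are made up for the example:

```scala
// Illustrative sketch of standalone-master core allocation (not Spark source).
// workerFree holds the free cores on each worker; the result is the number of
// cores assigned to the application on each worker.
object CoreAllocation {
  def assignCores(coresToAssign: Int, workerFree: Seq[Int], spreadOut: Boolean): Seq[Int] = {
    val assigned = Array.fill(workerFree.size)(0)
    var remaining = coresToAssign
    if (spreadOut) {
      // spreadOut = true: round-robin one core at a time across workers
      // that still have free capacity.
      var progress = true
      while (remaining > 0 && progress) {
        progress = false
        for (i <- workerFree.indices if remaining > 0 && assigned(i) < workerFree(i)) {
          assigned(i) += 1
          remaining -= 1
          progress = true
        }
      }
    } else {
      // spreadOut = false: fill each worker completely before moving on,
      // never assigning more cores to a worker than it actually has free.
      // SPARK-9260 was, in effect, a failure to apply this per-worker cap.
      for (i <- workerFree.indices if remaining > 0) {
        val take = math.min(remaining, workerFree(i))
        assigned(i) = take
        remaining -= take
      }
    }
    assigned.toSeq
  }
}
```

With 6 cores requested against three 4-core workers, the capped spreadOut=false allocation yields two executors (4 + 2 cores), not a single 6-core executor as the UI showed.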