[ https://issues.apache.org/jira/browse/SPARK-8099?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Sandy Ryza updated SPARK-8099:
------------------------------
    Assignee: meiyoula

> In yarn-cluster mode, "--executor-cores" can't be set into SparkConf
> ---------------------------------------------------------------------
>
>                 Key: SPARK-8099
>                 URL: https://issues.apache.org/jira/browse/SPARK-8099
>             Project: Spark
>          Issue Type: Bug
>          Components: YARN
>    Affects Versions: 1.0.0
>            Reporter: meiyoula
>            Assignee: meiyoula
>             Fix For: 1.5.0
>
>
> While testing the dynamic executor allocation feature, I set the executor
> cores with *--executor-cores 4* in the spark-submit command. But in
> *ExecutorAllocationManager*, *private val tasksPerExecutor =
> conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)*
> still evaluates to 1.
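For illustration, a minimal Scala sketch of how the quoted expression ends up at 1: only the two getInt calls mirror the ExecutorAllocationManager code quoted above; the object name and the bare SparkConf setup are hypothetical stand-ins for the yarn-cluster code path, where --executor-cores was forwarded to YARN but never written back into the driver's SparkConf.

{code:scala}
import org.apache.spark.SparkConf

object TasksPerExecutorSketch {
  def main(args: Array[String]): Unit = {
    // Hypothetical stand-in for the driver's conf in yarn-cluster mode:
    // --executor-cores 4 was passed on the command line, but
    // "spark.executor.cores" was never set in the SparkConf.
    val conf = new SparkConf(loadDefaults = false)

    // The expression quoted from ExecutorAllocationManager: both keys fall
    // back to their defaults of 1, so the result is 1 despite the flag.
    val tasksPerExecutor =
      conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)
    println(s"tasksPerExecutor = $tasksPerExecutor")  // prints 1

    // Setting the property explicitly (e.g. --conf spark.executor.cores=4)
    // sidesteps the bug, since getInt now sees the real value.
    conf.set("spark.executor.cores", "4")
    val withWorkaround =
      conf.getInt("spark.executor.cores", 1) / conf.getInt("spark.task.cpus", 1)
    println(s"tasksPerExecutor = $withWorkaround")  // prints 4
  }
}
{code}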