liuxian created SPARK-21506:
-------------------------------

             Summary: The description of "spark.executor.cores" may not be correct
                 Key: SPARK-21506
                 URL: https://issues.apache.org/jira/browse/SPARK-21506
             Project: Spark
          Issue Type: Bug
          Components: Documentation
    Affects Versions: 2.3.0
            Reporter: liuxian
The description of "spark.executor.cores" reads: "The number of cores assigned to each executor is configurable. When this is not explicitly set, only one executor per application will run on the same worker." I think this is not correct, because if an application is not assigned enough cores in the first `schedule()` pass, another executor for the same application may still be launched on the same worker the next time `schedule()` runs.
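For reference, a minimal standalone-mode sketch (not part of the original report; the property values are illustrative assumptions, the property names are the standard Spark configs) showing how "spark.executor.cores" is typically set explicitly, which is the documented case where several executors of the same application can land on one worker:

{code}
import org.apache.spark.{SparkConf, SparkContext}

object ExecutorCoresExample {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("executor-cores-demo")
      // Cap cores per executor; multiple executors of this application can
      // then be launched on the same worker if it has spare cores and memory.
      .set("spark.executor.cores", "2")
      .set("spark.executor.memory", "1g")
      // Total cores the application may acquire across the cluster.
      .set("spark.cores.max", "8")

    val sc = new SparkContext(conf)
    println(sc.parallelize(1 to 100).sum())
    sc.stop()
  }
}
{code}

When "spark.executor.cores" is left unset, each executor grabs all available cores on its worker by default, but the point of this issue is that a worker short of cores in one schedule() pass may still receive an additional executor for the same application in a later pass.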