[ https://issues.apache.org/jira/browse/SPARK-4585?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14265915#comment-14265915 ]
Lianhui Wang commented on SPARK-4585:
-------------------------------------

Yes, I think the initial number of executors can be inferred. In most cases, I think the initial executor count is the number of tasks in the first-level running stages.

> Spark dynamic executor allocation shouldn't use maxExecutors as initial number
> ------------------------------------------------------------------------------
>
>                 Key: SPARK-4585
>                 URL: https://issues.apache.org/jira/browse/SPARK-4585
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core, YARN
>    Affects Versions: 1.1.0
>            Reporter: Chengxiang Li
>
> With SPARK-3174, one can configure a minimum and maximum number of executors
> for a Spark application on YARN. However, the application always starts with
> the maximum. It seems more reasonable, at least for Hive on Spark, to start
> from the minimum and scale up as needed up to the maximum.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
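For context, a minimal sketch of the dynamic allocation settings this issue discusses. The `minExecutors`/`maxExecutors` properties come from SPARK-3174 as described above; the `initialExecutors` property shown last is the kind of knob the ticket asks for and was added to Spark in later releases, so treat it as illustrative rather than available in 1.1.0:

```properties
# spark-defaults.conf (sketch) -- dynamic allocation on YARN
spark.dynamicAllocation.enabled          true
# An external shuffle service is required so executors can be removed safely.
spark.shuffle.service.enabled            true
# Bounds introduced by SPARK-3174:
spark.dynamicAllocation.minExecutors     2
spark.dynamicAllocation.maxExecutors     50
# The starting count this issue argues should not default to the maximum
# (added in later Spark versions; shown here only as illustration):
spark.dynamicAllocation.initialExecutors 2
```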