[ https://issues.apache.org/jira/browse/SPARK-20589?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16043877#comment-16043877 ]
Fei Shao commented on SPARK-20589:
----------------------------------

Tasks are assigned to executors. If we set the number of executors to 5 and set the number of simultaneous tasks to 2, a contradiction occurs: the per-stage limit is smaller than the number of executors available to run tasks. So can we change the requirement to "allow limiting task concurrency per executor" please?

> Allow limiting task concurrency per stage
> -----------------------------------------
>
>                 Key: SPARK-20589
>                 URL: https://issues.apache.org/jira/browse/SPARK-20589
>             Project: Spark
>          Issue Type: Improvement
>          Components: Scheduler
>    Affects Versions: 2.1.0
>            Reporter: Thomas Graves
>
> It would be nice to have the ability to limit the number of concurrent tasks
> per stage. This is useful when your Spark job might be accessing another
> service and you don't want to DOS that service — for instance, Spark writing
> to HBase or Spark doing HTTP PUTs against a service. Many times you want to
> do this without limiting the number of partitions.

--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
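The behavior the issue asks for — capping how many tasks hit an external service at once, independent of the partition count — can be sketched outside Spark with a plain semaphore. This is an illustrative Python sketch of the throttling idea only, not Spark scheduler code; the `MAX_CONCURRENT` value and the `write_record` stand-in for an HBase put or HTTP PUT are hypothetical:

```python
import threading
from concurrent.futures import ThreadPoolExecutor

MAX_CONCURRENT = 2                      # desired task concurrency (hypothetical)
limiter = threading.Semaphore(MAX_CONCURRENT)

lock = threading.Lock()
active = 0      # tasks currently "inside" the service call
peak = 0        # highest concurrency observed
completed = 0   # total tasks finished

def write_record(i):
    """Stand-in for one task calling an external service (e.g. an HBase put)."""
    global active, peak, completed
    with limiter:                       # at most MAX_CONCURRENT threads pass here
        with lock:
            active += 1
            peak = max(peak, active)
        # ... the real service call would go here ...
        with lock:
            active -= 1
            completed += 1

# 10 "tasks" (one per partition), but at most 2 touch the service at a time,
# so we do not have to coalesce down to 2 partitions to get the limit.
with ThreadPoolExecutor(max_workers=10) as pool:
    list(pool.map(write_record, range(10)))

print(peak, completed)
```

All 10 tasks still run; only their overlap against the service is bounded. This is also why the feature is requested at the scheduler level: doing this by shrinking the partition count changes the job's parallelism everywhere, not just at the service boundary.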