Fu Chen created SPARK-26369:
-------------------------------

             Summary: How to limit the number of concurrent Spark tasks in one job?
                 Key: SPARK-26369
                 URL: https://issues.apache.org/jira/browse/SPARK-26369
             Project: Spark
          Issue Type: Question
          Components: Scheduler
    Affects Versions: 2.4.0, 2.3.2, 2.2.0, 2.1.0
            Reporter: Fu Chen


Hi All,
Would it be possible to make the fair scheduler pools pluggable, so that we
can implement our own SchedulingAlgorithm? In our case, we want to limit the
maximum number of concurrent tasks in a job that loads data from a MySQL
database: if we set a large executor count * cores per executor, the load
triggers an alarm on the database side. Or is there another way to achieve
this?
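As a workaround that needs no scheduler change, the JDBC data source's
numPartitions option caps how many partitions (and therefore concurrent
tasks) read from the database in that stage, regardless of the cluster's
total executor * core count. A minimal sketch, assuming hypothetical
connection details (URL, table, and partition column are placeholders):

```scala
// Sketch only: all connection details below are hypothetical.
val df = spark.read
  .format("jdbc")
  .option("url", "jdbc:mysql://db-host:3306/mydb")  // hypothetical URL
  .option("dbtable", "events")                      // hypothetical table
  .option("partitionColumn", "id")                  // numeric split column
  .option("lowerBound", "0")
  .option("upperBound", "1000000")
  .option("numPartitions", "8")  // at most 8 tasks query MySQL concurrently
  .load()
```

This bounds only the JDBC read stage, not downstream stages, which is why a
pluggable SchedulingAlgorithm (or a per-pool max-task cap) would still be
useful for the general case.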



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
