Noritaka Sekiyama created SPARK-32112:
-----------------------------------------

             Summary: Add a method to calculate the number of parallel tasks 
that Spark can process at the same time
                 Key: SPARK-32112
                 URL: https://issues.apache.org/jira/browse/SPARK-32112
             Project: Spark
          Issue Type: Improvement
          Components: Spark Core
    Affects Versions: 3.0.0
            Reporter: Noritaka Sekiyama


Repartition/coalesce is very important for optimizing a Spark application's 
performance; however, many users struggle to determine the right number of 
partitions.
This issue proposes adding a method that calculates the number of parallel 
tasks Spark can process at the same time.

It would help Spark users determine the optimal number of partitions.
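
A rough sketch of what such a helper could look like (the object and method 
names here are hypothetical, and the calculation is just one possible approach 
built on existing SparkContext APIs, not the proposed implementation):

{code:scala}
import org.apache.spark.sql.SparkSession

object ParallelismUtil {
  /** Estimates how many tasks the cluster can run at the same time.
   *  Falls back to defaultParallelism when executor info is incomplete. */
  def maxConcurrentTasks(spark: SparkSession): Int = {
    val sc = spark.sparkContext
    // getExecutorMemoryStatus also lists the driver; subtract it for
    // cluster mode, but keep at least 1 so local mode still works.
    val executorCount = math.max(sc.getExecutorMemoryStatus.size - 1, 1)
    val coresPerExecutor = sc.getConf.getInt("spark.executor.cores", 1)
    val cpusPerTask = sc.getConf.getInt("spark.task.cpus", 1)
    // Total task slots = executors * cores per executor / CPUs per task.
    math.max(executorCount * coresPerExecutor / cpusPerTask,
             sc.defaultParallelism)
  }
}
{code}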

Expected use-cases:

- repartition with the calculated number of parallel tasks (see the usage 
sketch below)
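
For example, a caller could size partitions off that value (again assuming 
the hypothetical helper sketched above):

{code:scala}
// Repartition to a small multiple of the cluster's concurrent task slots
// so every slot stays busy across a couple of waves of tasks.
val repartitioned = df.repartition(2 * ParallelismUtil.maxConcurrentTasks(spark))
{code}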



