Hi,

The documentation states that Spark 3.5 or later is required for Spark
dynamic allocation to be supported through Celeborn:
https://celeborn.apache.org/docs/latest/deploy/#spark-configuration
> Support Spark Dynamic Resource Allocation
> Required Spark version >= 3.5.0

Can I confirm that the only reason for this prerequisite (Spark 3.5 or
greater) is that versions below 3.5 require the Spark patches described
here to be applied first?
https://github.com/apache/incubator-celeborn?tab=readme-ov-file#support-spark-dynamic-allocation
In other words, once patched and rebuilt, Spark dynamic allocation with,
say, Spark v3.2 and Celeborn should function as expected.
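
For context, this is the kind of setup I have in mind (a minimal
spark-defaults.conf sketch; the master endpoint is a placeholder, and the
exact keys should be checked against the linked Celeborn docs):

```properties
# Route shuffle data through Celeborn instead of the local shuffle service
spark.shuffle.manager                              org.apache.spark.shuffle.celeborn.SparkShuffleManager
spark.celeborn.master.endpoints                    celeborn-master-host:9097

# Dynamic allocation without the external shuffle service
spark.shuffle.service.enabled                      false
spark.dynamicAllocation.enabled                    true
spark.dynamicAllocation.shuffleTracking.enabled    true
```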

Thank you
Curtis
