GitHub user yihua added a comment to the discussion: 1.2 Release Planning

@suryaprasanna Maintaining the Hudi integration for more than 3 or 4 Spark 
versions brings overhead to development, as we need to test against all these 
versions for new features and account for version-specific APIs, since Hudi is 
deeply integrated with Spark internals.  There is certain logic that is Spark 
3.3-specific and can be removed once Spark 3.3 is deprecated, simplifying the 
code base. The latest patch release of Spark 3.3, Spark 3.3.4, was released 
almost 2 years ago.  Spark 4.1.0 is coming soon 
(https://spark.apache.org/news/spark-4-1-0-preview4-released.html).  So IMO we 
should gradually remove old Spark version support based on the latest Spark 
releases, and focus more on new Spark versions (e.g., Spark 3.5.x and the 
latest Spark 4.x, which are actively maintained by the Spark community) to save 
our development cycles.  Wdyt?
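To illustrate the kind of version-specific branching being discussed, here is a minimal, hypothetical sketch of gating a code path on a Spark version floor. The class and method names (`SparkVersionGate`, `isAtLeast`) are illustrative only and are not actual Hudi APIs:

```java
// Hypothetical sketch: gating version-specific code paths on a minimum
// Spark version. Not a Hudi API; names here are illustrative.
public class SparkVersionGate {
    // Compare dotted version strings numerically, e.g. "3.5.1" >= "3.4.0".
    static boolean isAtLeast(String version, String minimum) {
        String[] v = version.split("\\.");
        String[] m = minimum.split("\\.");
        int n = Math.max(v.length, m.length);
        for (int i = 0; i < n; i++) {
            int a = i < v.length ? Integer.parseInt(v[i]) : 0;
            int b = i < m.length ? Integer.parseInt(m[i]) : 0;
            if (a != b) {
                return a > b;
            }
        }
        return true; // versions are equal
    }

    public static void main(String[] args) {
        // A Spark 3.3-specific fallback path could be dropped once the
        // supported floor moves to 3.4+.
        System.out.println(isAtLeast("3.3.4", "3.4.0")); // false
        System.out.println(isAtLeast("3.5.1", "3.4.0")); // true
    }
}
```

Once an old version falls below the supported floor, both the gate and the legacy branch behind it can be deleted, which is the code-base simplification mentioned above.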

GitHub link: 
https://github.com/apache/hudi/discussions/14307#discussioncomment-15091345

----
This is an automatically sent email for [email protected].