Thanks Anton and Eduard. I'm OK with being more aggressive with the deprecation schedule. Looking at the git history for `spark/v3.4/` <https://github.com/apache/iceberg/commits/main/spark/v3.4>, there are 5 new commits since the 1.10 release. Only 1 commit (3bbdee9 <https://github.com/apache/iceberg/commit/3bbdee97b0f33794df091c3292de8d54a05dcc1e>) is a backport specifically for Spark 3.4.
From Engine Version Lifecycle <https://iceberg.apache.org/multi-engine-support/#engine-version-lifecycle>:

> Deprecated: an engine version is no longer actively maintained. People who
> are still interested in the version can backport any necessary feature or
> bug fix from newer versions, but the community will not spend effort in
> achieving feature parity. Iceberg recommends users to move towards a newer
> version. Contributions to a deprecated version is expected to diminish over
> time, so that eventually no change is added to a deprecated version.

and

> End-of-life: a vote can be initiated in the community to fully remove a
> deprecated version out of the Iceberg repository to mark as its end of life.

Let's change the status for Spark 3.4 to "Deprecated". I have prepared a PR
already: https://github.com/apache/iceberg/pull/14099

I can start another vote thread for removal of Spark 3.4 in the upcoming 1.11
release and mark it as "End-of-life". I have already prepared a PR to fully
remove Spark 3.4 from the codebase: https://github.com/apache/iceberg/pull/14122

Best,
Kevin Liu

On Fri, Sep 19, 2025 at 11:29 AM Eduard Tudenhöfner <[email protected]> wrote:

> I agree with Anton and I would be in favor of just removing it in the next
> release. By updating the docs now we can already signal immediately that
> Spark 3.4 is deprecated and people can always use Iceberg 1.10 when needing
> Spark 3.4 support.
>
> On Fri, Sep 19, 2025 at 7:06 PM Anton Okolnychyi <[email protected]>
> wrote:
>
>> I know we followed this rule of deprecating a Spark version in one
>> release and then removing it in the next one. Shall we ask ourselves
>> whether it is still the model we want to follow?
>>
>> My problem, like before, is that we release a new Iceberg jar that is
>> supposed to contain the latest and greatest features, but the functionality
>> for older Spark versions is severely lagging.
>>
>> We initially kept older Spark modules in main to give folks in the
>> community a place to maintain these older integrations and collaborate. I
>> don’t see a lot of interest in that, if I am being honest. Instead, it
>> became a liability for devs as all major format features now have to work
>> with those old Spark integrations. It is hurting the velocity of the
>> project, and the recent row ID work is an example of that.
>>
>> - Anton
>>
>> On Fri, Sep 19, 2025 at 6:33 PM Kevin Liu <[email protected]> wrote:
>>
>>> > why not just remove Spark 3.4 for the next 1.11 release? Or do we
>>> usually wait for one more release and remove it in the 1.12 release after
>>> marking 3.4 as deprecated in the engine status doc page?
>>>
>>> My preference is to mark as deprecated for one release and remove in the
>>> following.
>>>
>>> To quote JB:
>>> "announce" the deprecation in 1.11 and remove 1.12, it gives time for
>>> users to "adapt".
>>>
>>> Best,
>>> Kevin Liu
>>>
>>> On Fri, Sep 19, 2025 at 9:26 AM Steven Wu <[email protected]> wrote:
>>>
>>>> Following up on Manu's question, why not just remove Spark 3.4 for the
>>>> next 1.11 release? Or do we usually wait for one more release and remove it
>>>> in the 1.12 release after marking 3.4 as deprecated in the engine status
>>>> doc page?
>>>>
>>>> On Fri, Sep 19, 2025 at 9:12 AM Kevin Liu <[email protected]>
>>>> wrote:
>>>>
>>>>> Given the many +1's here, I've moved the PR to deprecate 3.4 to "ready
>>>>> for review": https://github.com/apache/iceberg/pull/14099
>>>>>
>>>>> > Does it mean we will stop back-porting PRs to Spark 3.4 for 1.11?
>>>>>
>>>>> Not necessarily. There are a lot of Spark 3.4 backports already:
>>>>> https://github.com/apache/iceberg/commits/main/spark/v3.4
>>>>> I suggest we continue to backport for consistency and then stop right
>>>>> after the 1.11 release.
>>>>>
>>>>> Best,
>>>>> Kevin Liu
>>>>>
>>>>> On Fri, Sep 19, 2025 at 6:18 AM Amogh Jahagirdar <[email protected]>
>>>>> wrote:
>>>>>
>>>>>> +1
>>>>>>
>>>>>> On Fri, Sep 19, 2025 at 2:03 AM Péter Váry <
>>>>>> [email protected]> wrote:
>>>>>>
>>>>>>> +1
>>>>>>>
>>>>>>> On Fri, Sep 19, 2025 at 8:56 AM Eduard Tudenhöfner
>>>>>>> <[email protected]> wrote:
>>>>>>>
>>>>>>>> +1 on deprecating Spark 3.4
>>>>>>>>
>>>>>>>> On Thu, Sep 18, 2025 at 8:36 AM Steve <[email protected]>
>>>>>>>> wrote:
>>>>>>>>
>>>>>>>>> +1
>>>>>>>>>
>>>>>>>>> On Wed, Sep 17, 2025 at 22:52 Jean-Baptiste Onofré <
>>>>>>>>> [email protected]> wrote:
>>>>>>>>>
>>>>>>>>>> +1
>>>>>>>>>>
>>>>>>>>>> I agree about the plan to "announce" the deprecation in 1.11 and
>>>>>>>>>> remove 1.12, it gives time for users to "adapt".
>>>>>>>>>>
>>>>>>>>>> Regards
>>>>>>>>>> JB
>>>>>>>>>>
>>>>>>>>>> On Wed, Sep 17, 2025 at 10:31 PM Kevin Liu <[email protected]>
>>>>>>>>>> wrote:
>>>>>>>>>> >
>>>>>>>>>> > Hi everyone,
>>>>>>>>>> >
>>>>>>>>>> > I’d like to bring up the topic of deprecating Spark 3.4 in an
>>>>>>>>>> upcoming release. Anton initially suggested this during our previous
>>>>>>>>>> dev list discussion about maintaining feature parity across the Spark
>>>>>>>>>> versions we support for 1.10.
>>>>>>>>>> >
>>>>>>>>>> > Currently, we support two different Spark 3.x versions, 3.4 and
>>>>>>>>>> 3.5. Spark 3.4’s last maintenance release was in October 2024, and
>>>>>>>>>> it is now considered end-of-life.
>>>>>>>>>> >
>>>>>>>>>> > What are your thoughts on marking Spark 3.4 as deprecated in
>>>>>>>>>> 1.11 and removing it in 1.12?
>>>>>>>>>> >
>>>>>>>>>> > For reference, here's the previous discussion thread on
>>>>>>>>>> deprecating Spark 3.3.
>>>>>>>>>> >
>>>>>>>>>> > Best,
>>>>>>>>>> >
>>>>>>>>>> > Kevin Liu
