Hi Everyone,

While testing my recent changes, I discovered a bug in the Iceberg Spark
code affecting queries. More detail about the bug:
https://github.com/apache/iceberg/issues/16283

Specifically, SerializableFileIOWithSize fails to override the
newInputFile(String path, long length) method. This oversight causes the
file length property to be dropped during Spark execution. We created a PR
to fix this bug: https://github.com/apache/iceberg/pull/16284
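For illustration, here is a minimal, self-contained sketch of the failure mode. The interface and class bodies below are simplified stand-ins, not the actual Iceberg code: when a delegating wrapper does not override the two-argument newInputFile, calls fall through to the interface default, which discards the caller-supplied length.

```java
// Simplified stand-ins for FileIO/InputFile; not the actual Iceberg code.
interface InputFile {
    long getLength();
}

interface FileIO {
    InputFile newInputFile(String path);

    // Interface default: ignores the caller-supplied length.
    default InputFile newInputFile(String path, long length) {
        return newInputFile(path);
    }
}

class KnownLengthFile implements InputFile {
    private final long length;

    KnownLengthFile(long length) {
        this.length = length;
    }

    public long getLength() {
        return length;
    }
}

// The fix: override BOTH variants so the known length is propagated
// instead of falling through to the interface default.
class SerializableFileIOWithSize implements FileIO {
    public InputFile newInputFile(String path) {
        return new KnownLengthFile(-1L); // length unknown; a real impl would stat the file
    }

    @Override
    public InputFile newInputFile(String path, long length) {
        return new KnownLengthFile(length); // length preserved
    }
}

public class FileLengthDemo {
    public static void main(String[] args) {
        FileIO io = new SerializableFileIOWithSize();
        // With the override in place, the supplied length survives.
        System.out.println(io.newInputFile("s3://bucket/file.parquet", 1024L).getLength());
        // prints 1024
    }
}
```

Without the @Override method, the same call would return -1 here, which is the symptom described in the issue: the known file length is dropped and must be re-fetched (or is lost entirely) at execution time.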

Talat


On Sat, May 9, 2026 at 5:43 PM Aihua Xu <[email protected]> wrote:

> Thanks for calling this out. This seems to be a blocker. I'll hold off on
> proceeding until it's resolved.
>
>
> On Sat, May 9, 2026 at 10:03 AM Amogh Jahagirdar <[email protected]> wrote:
>
>> Hey folks, on our side we were looking at a customer issue yesterday and
>> observed that the first row IDs for existing entries weren't carried over
>> as expected when moving them to a new manifest. This violates the spec and
>> is naturally unexpected. I've published a draft
>> <https://github.com/apache/iceberg/pull/16263> with a minimal test
>> reproducing the issue and a possible fix, though I'm verifying that it's
>> the right fix. It would be good to get more eyes on this. Once we verify
>> it's a legitimate issue, I unfortunately think it's a release blocker
>> because it involves metadata corruption.
>>
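To make the invariant above concrete, here is a toy model (the names, record shape, and numbers are hypothetical illustrations, not Iceberg's actual manifest API): entries that already have an assigned first row ID must keep it when they are rewritten into a new manifest; only entries without one get a fresh assignment.

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicLong;
import java.util.stream.Collectors;

// Hypothetical model of the invariant: existing entries carry their
// first row ID into the new manifest; only new entries are assigned one.
public class RowIdCarryOver {
    record Entry(String dataFile, Long firstRowId) {}

    static List<Entry> rewrite(List<Entry> entries, AtomicLong nextRowId) {
        return entries.stream()
            .map(e -> e.firstRowId() != null
                ? e // existing entry: carry the assigned ID over unchanged
                // new entry: assign the next ID, advancing by the file's
                // row count (a placeholder of 100 here)
                : new Entry(e.dataFile(), nextRowId.getAndAdd(100)))
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        AtomicLong next = new AtomicLong(1000);
        List<Entry> out = rewrite(
            List.of(new Entry("a.parquet", 0L), new Entry("b.parquet", null)), next);
        // Existing entry keeps ID 0; the new entry gets 1000.
        System.out.println(out.get(0).firstRowId() + " " + out.get(1).firstRowId());
        // prints 0 1000
    }
}
```

The bug described above corresponds to taking the reassignment branch for existing entries too, which rewrites metadata that the spec says must be preserved.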
>> Thanks,
>> Amogh Jahagirdar
>>
>> On Fri, May 8, 2026 at 3:31 PM Talat Uyarer <[email protected]> wrote:
>>
>>> Created a PR for this: https://github.com/apache/iceberg/pull/16258
>>>
>>> On 2026/05/08 20:04:04 Russell Spitzer wrote:
>>> > Just post here when you have it
>>> >
>>> > On Fri, May 8, 2026 at 3:03 PM Talat Uyarer <[email protected]> wrote:
>>> >
>>> > > I am working on a solution to decouple GCS and Analytics Core when
>>> > > the feature is not enabled. I will create a PR soon, but I need a
>>> > > reviewer.
>>> > >
>>> > > On 2026/05/08 19:11:23 Kevin Liu wrote:
>>> > > > Looks like all the changes Peter linked above are now merged.
>>> > > >
>>> > > > I also merged the Azure PR
>>> > > > (https://github.com/apache/iceberg/pull/16186), thanks for
>>> > > > catching this.
>>> > > >
>>> > > > For the "GCS Analytics Core" issue, here's how I understand it:
>>> > > > - Users using `gcp-bundle` are fine, confirmed by Yuya in Trino,
>>> > > > since the bundle includes `gcs-analytics-core` in its shadow jar.
>>> > > > - Users using just `iceberg-gcp` (for GCSFileIO, etc.) will see
>>> > > > `NoClassDefFoundError` even though `gcs.analytics-core.enabled`
>>> > > > defaults to false.
>>> > > > *So `gcs-analytics-core` is a new required dependency to use
>>> > > > GCSFileIO, because analytics-core classes are eagerly loaded
>>> > > > regardless of configuration.*
>>> > > >
>>> > > > I also verified all 3 cloud bundles (AWS, Azure, GCP) for similar
>>> > > > issues and didn't find any additional cases.
>>> > > >
>>> > > > Best,
>>> > > > Kevin Liu
>>> > > >
>>> > > >
>>> > > > On Fri, May 8, 2026 at 11:15 AM Steve Loughran
>>> > > > <[email protected]> wrote:
>>> > > >
>>> > > > > I did actually get Claude to do a packaging audit: verifying
>>> > > > > checksums, signatures, source code == tag, *and* that the jars
>>> > > > > in Nexus match those I get in a local build. I cover that
>>> > > > > process a bit more on the dev@parquet list for the curious;
>>> > > > > this one was just giving the Claude session the new vote email
>>> > > > > and telling it to build with ./gradlew -x test -x
>>> > > > > integrationTest.
>>> > > > >
>>> > > > > No problems there: git source == .tar source == Nexus artifacts.
>>> > > > >
>>> > > > > On Fri, 8 May 2026 at 18:04, Steven Wu <[email protected]>
>>> > > > > wrote:
>>> > > > >
>>> > > > >> We will build RC2 from the latest main branch tonight.
>>> > > > >>
>>> > > > >> On Fri, May 8, 2026 at 8:27 AM Péter Váry
>>> > > > >> <[email protected]> wrote:
>>> > > > >>
>>> > > > >>> Just to clarify:
>>> > > > >>>
>>> > > > >>> The following PRs are already merged to 1.11.0:
>>> > > > >>>
>>> > > > >>>    - https://github.com/apache/iceberg/pull/14297 - Spark:
>>> > > > >>>    Support writing shredded variant in Iceberg-Spark
>>> > > > >>>    - https://github.com/apache/iceberg/pull/15512 - Spark:
>>> > > > >>>    fix delete from branch for canDeleteWhere where it does
>>> > > > >>>    not resolve to the correct branch - WAP fix
>>> > > > >>>    - https://github.com/apache/iceberg/pull/15475 - Flink:
>>> > > > >>>    Add Nanosecond Precision Support for Flink-Iceberg
>>> > > > >>>    Integration
>>> > > > >>>
>>> > > > >>> The missing ones are the backports of those to other engine
>>> > > > >>> versions:
>>> > > > >>>
>>> > > > >>>    - For 14297 <https://github.com/apache/iceberg/pull/14297>:
>>> > > > >>>       - 16241 <https://github.com/apache/iceberg/pull/16241> -
>>> > > > >>>       Backport for variant shredding in Spark 4.0
>>> > > > >>>    - For 15512 <https://github.com/apache/iceberg/pull/15512>:
>>> > > > >>>       - 16245 <https://github.com/apache/iceberg/pull/16245> -
>>> > > > >>>       Spark: backport PR #15512 to v3.4, v3.5, v4.0 for WAP
>>> > > > >>>       branch delete fix
>>> > > > >>>    - For 15475 <https://github.com/apache/iceberg/pull/15475>:
>>> > > > >>>       - #16183 <https://github.com/apache/iceberg/pull/16183>,
>>> > > > >>>       #16239 <https://github.com/apache/iceberg/pull/16239>,
>>> > > > >>>       #16240 <https://github.com/apache/iceberg/pull/16240> -
>>> > > > >>>       Backport of nano timestamps for Flink 2.0/1.20
>>> > > > >>>
>>> > > > >>> So the PRs needed on 1.11.0 are:
>>> > > > >>> https://github.com/apache/iceberg/pull/16241
>>> > > > >>> https://github.com/apache/iceberg/pull/16245
>>> > > > >>> https://github.com/apache/iceberg/pull/16183
>>> > > > >>> https://github.com/apache/iceberg/pull/16239
>>> > > > >>> https://github.com/apache/iceberg/pull/16240
>>> > > > >>> https://github.com/apache/iceberg/pull/16186
>>> > > > >>>
>>> > > > >>> Aihua Xu <[email protected]> wrote (on May 8, 2026, at 17:13):
>>> > > > >>>
>>> > > > >>>> Thank you all for the feedback and for verifying the
>>> > > > >>>> release candidate. Based on the issues identified above, we
>>> > > > >>>> will include the following fixes and cut RC2 with a new
>>> > > > >>>> vote:
>>> > > > >>>>
>>> > > > >>>> https://github.com/apache/iceberg/pull/14297
>>> > > > >>>> https://github.com/apache/iceberg/pull/15512
>>> > > > >>>> https://github.com/apache/iceberg/pull/15475
>>> > > > >>>> https://github.com/apache/iceberg/pull/16186
>>> > > > >>>>
>>> > > > >>>> Please let me know if you have any questions or have
>>> > > > >>>> identified additional issues.
>>> > > > >>>>
>>> > > > >>>> Thanks,
>>> > > > >>>> Aihua
>>> > > > >>>>
>>> > > > >>>> On Thu, May 7, 2026 at 10:09 PM Aihua Xu
>>> > > > >>>> <[email protected]> wrote:
>>> > > > >>>>
>>> > > > >>>>> I also looked into this. There is a configuration,
>>> > > > >>>>> gcs.analytics-core.enabled, to enable/disable GCS Analytics
>>> > > > >>>>> Core. The current implementation always requires a runtime
>>> > > > >>>>> dependency on GCS Analytics Core even if the configuration
>>> > > > >>>>> is off. Ideally we could lazily load the dependency so that
>>> > > > >>>>> it is only required when the feature is explicitly enabled.
>>> > > > >>>>> But since GCP is likely to enable GCS Analytics Core by
>>> > > > >>>>> default, I feel it's reasonable for downstream projects
>>> > > > >>>>> using non-bundle jars to add this dependency.
>>> > > > >>>>>
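For context, the lazy-loading idea discussed here could look roughly like the following: defer the optional class behind reflection so it is only resolved when the flag is on. The class name, flag handling, and error message are assumptions for illustration, not Iceberg's actual wiring.

```java
import java.util.Map;

// Illustrative sketch of lazily loading an optional dependency: the class
// is referenced by name via reflection only when the feature flag is on,
// so the jar is not needed on the classpath otherwise.
public class LazyDep {
    // Hypothetical fully-qualified name of the optional entry class.
    static final String ANALYTICS_CLASS = "com.example.analytics.AnalyticsCore";

    static Object maybeLoadAnalytics(Map<String, String> props) {
        boolean enabled = Boolean.parseBoolean(
            props.getOrDefault("gcs.analytics-core.enabled", "false"));
        if (!enabled) {
            return null; // feature off: the optional class is never touched
        }
        try {
            // Only here does the JVM try to resolve the optional class.
            return Class.forName(ANALYTICS_CLASS)
                .getDeclaredConstructor()
                .newInstance();
        } catch (ReflectiveOperationException e) {
            throw new IllegalStateException(
                "gcs.analytics-core.enabled=true but the dependency is not on the classpath", e);
        }
    }

    public static void main(String[] args) {
        // With the flag off, no NoClassDefFoundError even though the
        // optional jar is absent.
        System.out.println(maybeLoadAnalytics(Map.of()) == null); // prints true
    }
}
```

The eager-loading problem described in the thread arises when the optional class appears in field types, method signatures, or static initializers of an always-loaded class; moving the reference behind reflection (or a separate wrapper class) is what defers resolution.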
>>> > > > >>>>>
>>> > > > >>>>> On Thu, May 7, 2026 at 6:54 PM Steven Wu
>>> > > > >>>>> <[email protected]> wrote:
>>> > > > >>>>>
>>> > > > >>>>>> Looked a little more.
>>> > > > >>>>>>
>>> > > > >>>>>> Iceberg's cloud modules consistently use compileOnly for
>>> > > > >>>>>> vendor SDKs and rely on either the bundle artifact or
>>> > > > >>>>>> downstream coordination for runtime. So both changes are
>>> > > > >>>>>> expected for downstream consumers using the non-bundle
>>> > > > >>>>>> jars. Maybe we don't need to change anything.
>>> > > > >>>>>>
>>> > > > >>>>>> iceberg-gcp module:
>>> > > > >>>>>>
>>> > > > >>>>>> compileOnly platform(libs.google.libraries.bom)
>>> > > > >>>>>> compileOnly "com.google.cloud:google-cloud-storage"
>>> > > > >>>>>> compileOnly "com.google.cloud:google-cloud-kms"
>>> > > > >>>>>> compileOnly(libs.gcs.analytics.core)
>>> > > > >>>>>>
>>> > > > >>>>>>
>>> > > > >>>>>> On Thu, May 7, 2026 at 6:16 PM Steven Wu
>>> > > > >>>>>> <[email protected]> wrote:
>>> > > > >>>>>>
>>> > > > >>>>>>> Yuya, thanks for reporting the discovery.
>>> > > > >>>>>>>
>>> > > > >>>>>>> Azure: I approved your PR and can merge it soon:
>>> > > > >>>>>>> https://github.com/apache/iceberg/pull/16186
>>> > > > >>>>>>> GCP: the new dependency is marked as compileOnly in PR
>>> > > > >>>>>>> 14333 <https://github.com/apache/iceberg/pull/14333>, as
>>> > > > >>>>>>> it is an opt-in feature. We need to either change the
>>> > > > >>>>>>> dependency to implementation or update the code similar
>>> > > > >>>>>>> to the Azure fix above.
>>> > > > >>>>>>>
>>> > > > >>>>>>>
>>> > > > >>>>>>> On Thu, May 7, 2026 at 4:07 PM Yuya Ebihara
>>> > > > >>>>>>> <[email protected]> wrote:
>>> > > > >>>>>>>
>>> > > > >>>>>>>> Hi Aihua,
>>> > > > >>>>>>>>
>>> > > > >>>>>>>> Thanks for leading the release!
>>> > > > >>>>>>>>
>>> > > > >>>>>>>> Just a quick reminder about two dependency-related
>>> > > > >>>>>>>> items from a downstream perspective:
>>> > > > >>>>>>>> * Azure module users will require
>>> > > > >>>>>>>> azure-security-keyvault-keys, even when table
>>> > > > >>>>>>>> encryption is not used, as noted in
>>> > > > >>>>>>>> https://github.com/apache/iceberg/pull/16186
>>> > > > >>>>>>>> * GCS module users will require gcs-analytics-core
>>> > > > >>>>>>>>
>>> > > > >>>>>>>> I ran into CI failures with 1.11.0 in Trino because the
>>> > > > >>>>>>>> project does not use the azure-bundle or gcp-bundle
>>> > > > >>>>>>>> modules. The CI passed once we explicitly added these
>>> > > > >>>>>>>> two dependencies.
>>> > > > >>>>>>>>
>>> > > > >>>>>>>> Thanks,
>>> > > > >>>>>>>> Yuya Ebihara
>>> > > > >>>>>>>>
>>> > > > >>>>>>>> On Fri, May 8, 2026 at 4:58 AM Péter Váry
>>> > > > >>>>>>>> <[email protected]> wrote:
>>> > > > >>>>>>>>
>>> > > > >>>>>>>>> First of all, thanks to everyone for the effort put
>>> > > > >>>>>>>>> into preparing this release!
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>> I would like to highlight that RC1 is built from a
>>> > > > >>>>>>>>> branch where the following features have not been
>>> > > > >>>>>>>>> backported to all engine versions:
>>> > > > >>>>>>>>> - Spark: Support writing shredded variant in
>>> > > > >>>>>>>>> Iceberg-Spark
>>> > > > >>>>>>>>> (https://github.com/apache/iceberg/pull/14297) -
>>> > > > >>>>>>>>> Available in Spark 4.1, but not in Spark 4.0
>>> > > > >>>>>>>>> - Spark: fix delete from branch for canDeleteWhere
>>> > > > >>>>>>>>> where it does not resolve to the correct branch
>>> > > > >>>>>>>>> (https://github.com/apache/iceberg/pull/15512) -
>>> > > > >>>>>>>>> Available in Spark 4.1, but not in Spark 4.0, 3.5,
>>> > > > >>>>>>>>> or 3.4
>>> > > > >>>>>>>>> - Flink: Add Nanosecond Precision Support for
>>> > > > >>>>>>>>> Flink-Iceberg Integration
>>> > > > >>>>>>>>> (https://github.com/apache/iceberg/pull/15475) -
>>> > > > >>>>>>>>> Available in Flink 2.1, but not in Flink 2.0 or 1.20
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>> It is up to the community to decide whether these
>>> > > > >>>>>>>>> missing backports should be considered release
>>> > > > >>>>>>>>> blockers. Most of the corresponding PRs have already
>>> > > > >>>>>>>>> been merged to main (except #15512), and including
>>> > > > >>>>>>>>> them in the release should be relatively
>>> > > > >>>>>>>>> straightforward.
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>> From my perspective, I would prefer not to release
>>> > > > >>>>>>>>> with these gaps. That said, I understand the urgency
>>> > > > >>>>>>>>> and the need for a release, and I am happy to go with
>>> > > > >>>>>>>>> the community’s decision.
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>> Peter
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>> Aihua Xu <[email protected]> wrote (on May 7, 2026,
>>> > > > >>>>>>>>> at 18:26):
>>> > > > >>>>>>>>>
>>> > > > >>>>>>>>>> Hi Everyone,
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> I propose that we release the following RC as the
>>> official
>>> > > Apache
>>> > > > >>>>>>>>>> Iceberg 1.11.0 release.
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> The commit ID is 0f657edf12dc29f8487a679bfdd4210e9588d014
>>> > > > >>>>>>>>>> * This corresponds to the tag: apache-iceberg-1.11.0-rc1
>>> > > > >>>>>>>>>> * https://github.com/apache/iceberg/commits/apache-iceberg-1.11.0-rc1
>>> > > > >>>>>>>>>> * https://github.com/apache/iceberg/tree/0f657edf12dc29f8487a679bfdd4210e9588d014
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> The release tarball, signature, and checksums are here:
>>> > > > >>>>>>>>>> * https://dist.apache.org/repos/dist/dev/iceberg/apache-iceberg-1.11.0-rc1
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> You can find the KEYS file here:
>>> > > > >>>>>>>>>> * https://downloads.apache.org/iceberg/KEYS
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> Convenience binary artifacts are staged on Nexus. The
>>> > > > >>>>>>>>>> Maven repository URL is:
>>> > > > >>>>>>>>>> * https://repository.apache.org/content/repositories/orgapacheiceberg-1278/
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> Please download, verify, and test.
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> Instructions for verifying a release can be found here:
>>> > > > >>>>>>>>>> * https://iceberg.apache.org/how-to-release/#how-to-verify-a-release
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> Please vote in the next 72 hours.
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> [ ] +1 Release this as Apache Iceberg 1.11.0
>>> > > > >>>>>>>>>> [ ] +0
>>> > > > >>>>>>>>>> [ ] -1 Do not release this because...
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>> Only PMC members have binding votes, but other
>>> > > > >>>>>>>>>> community members are encouraged to cast non-binding
>>> > > > >>>>>>>>>> votes. This vote will pass if there are 3 binding +1
>>> > > > >>>>>>>>>> votes and more binding +1 votes than -1 votes.
>>> > > > >>>>>>>>>>
>>> > > > >>>>>>>>>>
