MartijnVisser commented on PR #23489: URL: https://github.com/apache/flink/pull/23489#issuecomment-1857914683
There are two possibilities to solve this:

1. Backport https://issues.apache.org/jira/browse/FLINK-33704 to `release-1.18`, which resolves the issue for 1.18.
2. Merge the commit in my updated PR https://github.com/apache/flink/pull/23920/commits/7c609007959cfdc2d3e6bd9634c1576b437a611a, which excludes Guava from `flink-fs-hadoop-shaded` and instead bundles it, together with the required `com.google.guava:failureaccess`, in `flink-gs-fs-hadoop`.

Both approaches make checkpointing work again; I've validated that locally. In `master` (so the upcoming Flink 1.19.0 release) this problem does not appear.

@JingGe @singhravidutt @snuyanzin Any preference? I'm leaning towards option 1, since that would be the same solution for both 1.18 and 1.19, whereas option 2 would be a fix specific to 1.18 only.
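For readers unfamiliar with how option 2 would look, the change amounts to a shading exclusion in one module and explicit dependencies in the other. The following is only a rough sketch of that idea, not the exact content of the linked commit (the real PR may configure the shade plugin and versions differently):

```xml
<!-- flink-fs-hadoop-shaded/pom.xml (sketch): stop bundling Guava here -->
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-shade-plugin</artifactId>
  <configuration>
    <artifactSet>
      <excludes>
        <exclude>com.google.guava:guava</exclude>
      </excludes>
    </artifactSet>
  </configuration>
</plugin>

<!-- flink-gs-fs-hadoop/pom.xml (sketch): bundle Guava plus failureaccess instead;
     versions assumed to be managed by the parent POM's dependencyManagement -->
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>guava</artifactId>
</dependency>
<dependency>
  <groupId>com.google.guava</groupId>
  <artifactId>failureaccess</artifactId>
</dependency>
```

`failureaccess` is needed because modern Guava versions declare it as a mandatory runtime dependency, so excluding Guava from the shaded Hadoop artifact means both jars must travel with `flink-gs-fs-hadoop`.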