Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
MartijnVisser commented on code in PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#discussion_r1601097465

## .github/workflows/push_pr.yml:
@@ -28,21 +28,16 @@ jobs:
   compile_and_test:
     strategy:
       matrix:
-        flink: [ 1.17.2 ]
-        jdk: [ '8, 11' ]
-        include:
-          - flink: 1.18.1
-            jdk: '8, 11, 17'
-          - flink: 1.19.0
-            jdk: '8, 11, 17, 21'
+        flink: [ 1.19.0, 1.20-SNAPSHOT ]
+        jdk: [ '8, 11, 17, 21' ]

Review Comment:
   > Maybe I didn't get your message about Java 21 support; however, it is mentioned in the 1.19 release notes

   I mixed up Flink 1.18 and 1.19, sorry. My comment was incorrect.

--
This is an automated message from the Apache Git Service. To respond to the message, please log on to GitHub and use the URL above to go to the specific comment.

To unsubscribe, e-mail: issues-unsubscr...@flink.apache.org
For queries about this service, please contact Infrastructure at: us...@infra.apache.org
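For context on the matrix shapes being debated: the old workflow bound a different JDK list to each Flink version via `include` entries, while the new one uses a single cross-product. A minimal sketch of the `include` pattern, should one Flink version ever need a narrower JDK set again (the version/JDK values here are illustrative, not a proposal):

```yaml
jobs:
  compile_and_test:
    strategy:
      matrix:
        flink: [ 1.19.0, 1.20-SNAPSHOT ]
        jdk: [ '8, 11, 17, 21' ]
        # Hypothetical override: an 'include' entry whose values conflict with an
        # existing combination is added as an extra combination, which is how the
        # pre-PR workflow ran older Flink versions on fewer JDKs:
        # include:
        #   - flink: 1.18.1
        #     jdk: '8, 11, 17'
```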
Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
snuyanzin commented on code in PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#discussion_r1601012883

## .github/workflows/push_pr.yml:
@@ -28,21 +28,16 @@ jobs:
   compile_and_test:
     strategy:
       matrix:
-        flink: [ 1.17.2 ]
-        jdk: [ '8, 11' ]
-        include:
-          - flink: 1.18.1
-            jdk: '8, 11, 17'
-          - flink: 1.19.0
-            jdk: '8, 11, 17, 21'
+        flink: [ 1.19.0, 1.20-SNAPSHOT ]
+        jdk: [ '8, 11, 17, 21' ]

Review Comment:
   Maybe I didn't get your message about Java 21 support; however, it is mentioned in the 1.19 release notes [1]. We also have nightly JDK 21 tests in the Flink main repo (same as for JDK 17) starting from 1.19, e.g. [2].

   [1] https://flink.apache.org/2024/03/18/announcing-the-release-of-apache-flink-1.19/#beta-support-for-java-21
   [2] https://dev.azure.com/apache-flink/apache-flink/_build/results?buildId=59560&view=results
Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
ruanhang1993 commented on code in PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#discussion_r1600864186

## .github/workflows/push_pr.yml:
@@ -28,21 +28,16 @@ jobs:
   compile_and_test:
     strategy:
       matrix:
-        flink: [ 1.17.2 ]
-        jdk: [ '8, 11' ]
-        include:
-          - flink: 1.18.1
-            jdk: '8, 11, 17'
-          - flink: 1.19.0
-            jdk: '8, 11, 17, 21'
+        flink: [ 1.19.0, 1.20-SNAPSHOT ]
+        jdk: [ '8, 11, 17, 21' ]

Review Comment:
   Hi, @MartijnVisser. I checked these files in both the JDBC connector [1] and the MongoDB connector [2]. It seems they all use Java 21 for Flink 1.19. Do I need to change them too?

   [1] https://github.com/apache/flink-connector-jdbc/blob/main/.github/workflows/weekly.yml
   [2] https://github.com/apache/flink-connector-mongodb/blob/main/.github/workflows/weekly.yml
Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
ruanhang1993 commented on code in PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#discussion_r1600860337

## flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/TypeSerializerUpgradeTestBase.java:
@@ -43,28 +40,19 @@
 import java.nio.file.Files;
 import java.nio.file.Path;
 import java.nio.file.Paths;
+import java.util.ArrayList;
 import java.util.Collection;
-import java.util.Set;
+import java.util.List;

 import static org.apache.flink.util.Preconditions.checkNotNull;
 import static org.apache.flink.util.Preconditions.checkState;
-import static org.assertj.core.api.AssertionsForInterfaceTypes.assertThat;
+import static org.assertj.core.api.Assertions.assertThat;
 import static org.assertj.core.api.Assumptions.assumeThat;
-import static org.hamcrest.CoreMatchers.not;

-/**
- * A test base for testing {@link TypeSerializer} upgrades.
- *
- * You can run {@link #generateTestSetupFiles(TestSpecification)} on a Flink branch to
- * (re-)generate the test data files.
- */
+/** A test base for testing {@link TypeSerializer} upgrades. */
 @TestInstance(TestInstance.Lifecycle.PER_CLASS)
-public abstract class TypeSerializerUpgradeTestBase {
-
-    public static final FlinkVersion CURRENT_VERSION = FlinkVersion.v1_17;
-
-    public static final Set<FlinkVersion> MIGRATION_VERSIONS =
-            FlinkVersion.rangeOf(FlinkVersion.v1_11, CURRENT_VERSION);
+public abstract class TypeSerializerUpgradeTestBase

Review Comment:
   Hi, @MartijnVisser. The changes to `TypeSerializerUpgradeTestBase` in [Flink#24603](https://github.com/apache/flink/pull/24603/files) and [Flink#23960](https://github.com/apache/flink/pull/23960/files) make the Kafka connector unable to compile against both 1.20-SNAPSHOT and 1.19.0 at the same time. I think we still have to maintain `TypeSerializerUpgradeTestBase`.
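The constants removed in the diff above computed the set of historical Flink versions whose serializer snapshots the upgrade tests migrate from. As a rough, self-contained illustration of that computation (the `FlinkVersion` enum below is a simplified stand-in for Flink's actual class, not its real definition):

```java
import java.util.EnumSet;
import java.util.Set;

public class MigrationVersionsSketch {
    // Hypothetical, simplified subset of org.apache.flink.FlinkVersion.
    enum FlinkVersion {
        v1_11, v1_12, v1_13, v1_14, v1_15, v1_16, v1_17, v1_18, v1_19;

        // Mirrors the semantics of FlinkVersion.rangeOf(from, to):
        // all versions between the two bounds, inclusive.
        static Set<FlinkVersion> rangeOf(FlinkVersion from, FlinkVersion to) {
            return EnumSet.range(from, to);
        }
    }

    // The shape of the constants the diff removed from the test base.
    static final FlinkVersion CURRENT_VERSION = FlinkVersion.v1_17;
    static final Set<FlinkVersion> MIGRATION_VERSIONS =
            FlinkVersion.rangeOf(FlinkVersion.v1_11, CURRENT_VERSION);

    public static void main(String[] args) {
        // v1_11 through v1_17 inclusive is 7 versions.
        System.out.println(MIGRATION_VERSIONS.size());
    }
}
```

Hard-coding `CURRENT_VERSION` like this is exactly why the copied class goes stale across Flink releases, which is the maintenance burden the thread is weighing.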
Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
MartijnVisser commented on code in PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#discussion_r1599848408

## .github/workflows/push_pr.yml:
@@ -28,21 +28,16 @@ jobs:
   compile_and_test:
     strategy:
       matrix:
-        flink: [ 1.17.2 ]
-        jdk: [ '8, 11' ]
-        include:
-          - flink: 1.18.1
-            jdk: '8, 11, 17'
-          - flink: 1.19.0
-            jdk: '8, 11, 17, 21'
+        flink: [ 1.19.0, 1.20-SNAPSHOT ]
+        jdk: [ '8, 11, 17, 21' ]

Review Comment:
   Flink 1.19 doesn't support Java 21, so you'll probably have to make the correct changes here.

## flink-connector-kafka/src/test/java/org/apache/flink/streaming/connectors/kafka/testutils/TypeSerializerUpgradeTestBase.java:
@@ -43,28 +40,19 @@
 import java.nio.file.Files;
 import java.nio.file.Path;
 import java.nio.file.Paths;
+import java.util.ArrayList;
 import java.util.Collection;
-import java.util.Set;
+import java.util.List;

 import static org.apache.flink.util.Preconditions.checkNotNull;
 import static org.apache.flink.util.Preconditions.checkState;
-import static org.assertj.core.api.AssertionsForInterfaceTypes.assertThat;
+import static org.assertj.core.api.Assertions.assertThat;
 import static org.assertj.core.api.Assumptions.assumeThat;
-import static org.hamcrest.CoreMatchers.not;

-/**
- * A test base for testing {@link TypeSerializer} upgrades.
- *
- * You can run {@link #generateTestSetupFiles(TestSpecification)} on a Flink branch to
- * (re-)generate the test data files.
- */
+/** A test base for testing {@link TypeSerializer} upgrades. */
 @TestInstance(TestInstance.Lifecycle.PER_CLASS)
-public abstract class TypeSerializerUpgradeTestBase {
-
-    public static final FlinkVersion CURRENT_VERSION = FlinkVersion.v1_17;
-
-    public static final Set<FlinkVersion> MIGRATION_VERSIONS =
-            FlinkVersion.rangeOf(FlinkVersion.v1_11, CURRENT_VERSION);
+public abstract class TypeSerializerUpgradeTestBase

Review Comment:
   Can't we now get rid of this one completely? I thought we copied this one over as part of https://issues.apache.org/jira/browse/FLINK-32455?focusedCommentId=17739785&page=com.atlassian.jira.plugin.system.issuetabpanels%3Acomment-tabpanel#comment-17739785 as a temporary solution?
Re: [PR] [FLINK-35109] Drop support for Flink 1.17 & 1.18 and fix tests for 1.20-SNAPSHOT [flink-connector-kafka]
ruanhang1993 commented on PR #102:
URL: https://github.com/apache/flink-connector-kafka/pull/102#issuecomment-2109819942

   @MartijnVisser @fapaul Please help review this PR, thanks~