Re: [DISCUSSION][JAVA] Current state of Java 17 support
We have run into some JDK-specific issues with our use of ByteBuddy though. On Thu, Dec 1, 2022 at 3:43 PM Luke Cwik via dev wrote: > We do support JDK8, JDK11 and JDK17. Our story around newer features > within JDKs 9+ like modules is mostly non-existent though. > > We rarely run into JDK specific issues, the latest were the TLS1 and > TLS1.1 deprecation in newer patch versions of the JDK and also the docker > cpu share issues with different JDK versions. Even though it would be nice > to cover more, we currently have too many flaky tests and an already busy > Jenkins cluster. I believe we would get a lot more value out of deflaking > our existing tests and re-enabling disabled tests. > > I got to give credit to the JDK folks for how well they have maintained > compatibility over the years. > > On Thu, Dec 1, 2022 at 9:05 AM Sachin Agarwal via dev > wrote: > >> This is a good heads up, thank you Cristian. >> >> On Thu, Dec 1, 2022 at 8:13 AM Cristian Constantinescu >> wrote: >> >>> Hi, >>> >>> I came across some Kafka info and would like to share for those >>> unaware. Kafka is planning to drop support for Java 8 in Kafka 4 (Java >>> 8 is deprecated in Kafka 3), see KIP-750 [1]. >>> >>> I'm not sure when Kafka 4 is scheduled to be released (probably a few >>> years down the road), but when it happens, KafkaIO may not be able to >>> support it if we maintain Java 8 compatibility unless it remains on >>> Kafka 3. >>> >>> Anyways, if not already done, I think it's a good idea to start >>> putting up serious warning flags around Beam used with Java 8, even >>> for Google cloud customers ;) >>> >>> Cheers, >>> Cristian >>> >>> [1] https://issues.apache.org/jira/browse/KAFKA-12894 >>> >>> On Wed, Nov 30, 2022 at 12:59 PM Kenneth Knowles >>> wrote: >>> > >>> > An important thing is to ensure that we do not accidentally depend on >>> something that would break Java 8 support.
>>> > >>> > Currently our Java 11 and 17 tests build the code with Java 8 (just >>> like our released artifacts) and then compile and run the test code with >>> the newer JDK. This roughly matches the user scenario, I think. So it is a >>> little more complex than just having separate test runs for different JDK >>> versions. But it would be good to make this more symmetrical between JDK >>> versions to develop the mindset that JDK is always explicit. >>> > >>> > Kenn >>> > >>> > On Wed, Nov 30, 2022 at 9:48 AM Alexey Romanenko < >>> aromanenko@gmail.com> wrote: >>> >> >>> >> >>> >> On 30 Nov 2022, at 03:56, Tomo Suzuki via dev >>> wrote: >>> >> >>> >> > Do we still need to support Java 8 SDK? >>> >> >>> >> Yes, for Google Cloud customers who still use Java 8, I want Apache >>> Beam to support Java 8. Do you observe any special burden maintaining Java >>> 8? >>> >> >>> >> >>> >> I can only think of the additional resources costs if we will test >>> all supported JDKs, as Austin mentioned above. Imho, we should do that for >>> all JDK that are officially supported. >>> >> Another less-costly way is to run the Java tests for all JDKs only >>> during the release preparation stage. >>> >> >>> >> I agree that it would make sense to continue to support Java 8 until >>> a significant number of users are using it. >>> >> >>> >> — >>> >> Alexey >>> >> >>> >> >>> >> >>> >> Regards, >>> >> Tomo >>> >> >>> >> On Tue, Nov 29, 2022 at 21:48 Austin Bennett >>> wrote: >>> >>> >>> >>> -1 for ongoing Java8 support [ or, said another way, +1 for dropping >>> support of Java8 ] >>> >>> >>> >>> +1 for having tests that run for ANY JDK that we say we support. Is >>> there any reason the resources to support are too costly [ or outweigh the >>> benefits of additional confidence in ensuring we support what we say we do >>> ]? I am not certain on whether this would only be critical for releases, >>> or should be done as part of regular CI. 
>>> >>> >>> On Tue, Nov 29, 2022 at 8:51 AM Alexey Romanenko < >>> aromanenko@gmail.com> wrote: >>> >>> Hello, >>> >>> I’m sorry if it’s already discussed somewhere but I find myself a >>> little bit lost in the subject. >>> So, I’d like to clarify this - what is a current official state of >>> Java 17 support at Beam? >>> >>> I recall that a great job was done to make Beam compatible with >>> Java 17 [1] and Beam already provides “beam_java17_sdk” Docker image [2] >>> but, iiuc, Java 8 is still the default JVM to run all Java tests on Jenkins >>> ("Java PreCommit" in the first order) and there are only limited number of >>> tests that are running with JDK 11 and 17 on Jenkins by dedicated jobs. >>> >>> So, my question would sound like if Beam officially supports Java >>> 17 (and 11), do we need to run all Beam Java SDK related tests (VR and IT >>> test including) against all supported Java SDKs? >>> >>> Do we still need to support Java 8 SDK? >>> >>> In the same time, as we are heading to move everything from Jenkins >>> to GitHub actions, what would be the default JDK there or we will run all >>> Java-related actions against all supported JDKs? >>> >>> — >>> Alexey >>> >>> [1] https://issues.apache.org/jira/browse/BEAM-12240 >>> [2] https://hub.docker.com/r/apache/beam_java17_sdk
Re: [DISCUSSION][JAVA] Current state of Java 17 support
We do support JDK8, JDK11 and JDK17. Our story around newer features within JDKs 9+ like modules is mostly non-existent though. We rarely run into JDK specific issues, the latest were the TLS1 and TLS1.1 deprecation in newer patch versions of the JDK and also the docker cpu share issues with different JDK versions. Even though it would be nice to cover more, we currently have too many flaky tests and an already busy Jenkins cluster. I believe we would get a lot more value out of deflaking our existing tests and re-enabling disabled tests. I got to give credit to the JDK folks for how well they have maintained compatibility over the years. On Thu, Dec 1, 2022 at 9:05 AM Sachin Agarwal via dev wrote: > This is a good heads up, thank you Cristian. > > On Thu, Dec 1, 2022 at 8:13 AM Cristian Constantinescu > wrote: > >> Hi, >> >> I came across some Kafka info and would like to share for those >> unaware. Kafka is planning to drop support for Java 8 in Kafka 4 (Java >> 8 is deprecated in Kafka 3), see KIP-750 [1]. >> >> I'm not sure when Kafka 4 is scheduled to be released (probably a few >> years down the road), but when it happens, KafkaIO may not be able to >> support it if we maintain Java 8 compatibility unless it remains on >> Kafka 3. >> >> Anyways, if not already done, I think it's a good idea to start >> putting up serious warning flags around Beam used with Java 8, even >> for Google cloud customers ;) >> >> Cheers, >> Cristian >> >> [1] https://issues.apache.org/jira/browse/KAFKA-12894 >> >> On Wed, Nov 30, 2022 at 12:59 PM Kenneth Knowles wrote: >> > >> > An important thing is to ensure that we do not accidentally depend on >> something that would break Java 8 support. >> > >> > Currently our Java 11 and 17 tests build the code with Java 8 (just >> like our released artifacts) and then compile and run the test code with >> the newer JDK. This roughly matches the user scenario, I think. 
So it is a >> little more complex than just having separate test runs for different JDK >> versions. But it would be good to make this more symmetrical between JDK >> versions to develop the mindset that JDK is always explicit. >> > >> > Kenn >> > >> > On Wed, Nov 30, 2022 at 9:48 AM Alexey Romanenko < >> aromanenko@gmail.com> wrote: >> >> >> >> >> >> On 30 Nov 2022, at 03:56, Tomo Suzuki via dev >> wrote: >> >> >> >> > Do we still need to support Java 8 SDK? >> >> >> >> Yes, for Google Cloud customers who still use Java 8, I want Apache >> Beam to support Java 8. Do you observe any special burden maintaining Java >> 8? >> >> >> >> >> >> I can only think of the additional resources costs if we will test all >> supported JDKs, as Austin mentioned above. Imho, we should do that for all >> JDK that are officially supported. >> >> Another less-costly way is to run the Java tests for all JDKs only >> during the release preparation stage. >> >> >> >> I agree that it would make sense to continue to support Java 8 until a >> significant number of users are using it. >> >> >> >> — >> >> Alexey >> >> >> >> >> >> >> >> Regards, >> >> Tomo >> >> >> >> On Tue, Nov 29, 2022 at 21:48 Austin Bennett >> wrote: >> >>> >> >>> -1 for ongoing Java8 support [ or, said another way, +1 for dropping >> support of Java8 ] >> >>> >> >>> +1 for having tests that run for ANY JDK that we say we support. Is >> there any reason the resources to support are too costly [ or outweigh the >> benefits of additional confidence in ensuring we support what we say we do >> ]? I am not certain on whether this would only be critical for releases, >> or should be done as part of regular CI. >> >>> >> >>> On Tue, Nov 29, 2022 at 8:51 AM Alexey Romanenko < >> aromanenko@gmail.com> wrote: >> >> Hello, >> >> I’m sorry if it’s already discussed somewhere but I find myself a >> little bit lost in the subject. >> So, I’d like to clarify this - what is a current official state of >> Java 17 support at Beam? 
>> >> I recall that a great job was done to make Beam compatible with Java >> 17 [1] and Beam already provides “beam_java17_sdk” Docker image [2] but, >> iiuc, Java 8 is still the default JVM to run all Java tests on Jenkins >> ("Java PreCommit" in the first order) and there are only limited number of >> tests that are running with JDK 11 and 17 on Jenkins by dedicated jobs. >> >> So, my question would sound like if Beam officially supports Java 17 >> (and 11), do we need to run all Beam Java SDK related tests (VR and IT test >> including) against all supported Java SDKs? >> >> Do we still need to support Java 8 SDK? >> >> In the same time, as we are heading to move everything from Jenkins >> to GitHub actions, what would be the default JDK there or we will run all >> Java-related actions against all supported JDKs? >> >> — >> Alexey >> >> [1] https://issues.apache.org/jira/browse/BEAM-12240 >> [2] https://hub.docker.com/r/apache/beam_java17_sdk >>
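[Editor's note] Since the thread above hinges on telling JDK 8 apart from 11 and 17, a small illustrative sketch may help (this is not Beam code; the helper name is made up): normalizing the `java.specification.version` system property, which reports "1.8" on Java 8 and earlier but a plain feature number ("11", "17") on Java 9+.

```shell
# Hypothetical helper (not from Beam): normalize a java.specification.version
# string to a single feature number. Java 8 and earlier report "1.8", "1.7",
# ...; Java 9 and later report "9", "11", "17".
feature_version() {
  case "$1" in
    1.*) echo "${1#1.}" ;;   # strip the legacy "1." prefix
    *)   echo "$1" ;;        # modern JDKs already report the feature number
  esac
}

feature_version "1.8"   # prints 8
feature_version "17"    # prints 17
```

The legacy prefix is also why raw string comparisons of this property go wrong: a lexical sort puts "11" before "9", so version gates should compare the normalized numbers instead.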
Re: [PROPOSAL] Preparing for Apache Beam 2.43.0 Release
Just an update that the branch is cut. There are 8 issues targeted to the release milestone: https://github.com/apache/beam/milestone/7 (thanks Cham for the correct link!) Please help to close these out or triage them off the milestone. I will be looking at them now. Kenn On Thu, Nov 17, 2022 at 2:27 PM Chamikara Jayalath via dev < dev@beam.apache.org> wrote: > > Thanks Kenn. > BTW the correct milestone for the 2.44.0 release should be this one: > https://github.com/apache/beam/milestone/7 > > - Cham > > > On Thu, Nov 17, 2022 at 9:12 AM Ahmet Altay via dev > wrote: > >> Thank you Kenn! :) >> >> On Wed, Nov 16, 2022 at 12:45 PM Kenneth Knowles wrote: >> >>> Hi all, >>> >>> The 2.44.0 release cut is scheduled for Nov 30th [1]. I'd like to >>> volunteer to do this release. >>> >>> As usual, my plan would be to cut right on that date and cherry >>> pick critical fixes. >>> >>> Help me and the release by: >>> - Making sure that any unresolved release blocking issues for 2.44.0 >>> have their "Milestone" marked as "2.44.0 Release" [2]. >>> - Reviewing the current release blockers [2] and removing the Milestone >>> if they don't meet the criteria at [3]. >>> >>> Kenn >>> >>> [1] >>> https://calendar.google.com/calendar/u/0/embed?src=0p73sl034k80oob7seouani...@group.calendar.google.com >>> [2] https://github.com/apache/beam/milestone/5 >>> [3] https://beam.apache.org/contribute/release-blocking/ >>> >>> Kenn >>> >>
Re: Credentials Rotation Failure on IO-Datastores cluster
Update - the job has been successfully run and the permanent fix is merged. I'll follow up with a PR to fix the links in the failure email. On Thu, Dec 1, 2022 at 2:00 PM Apache Jenkins Server < jenk...@builds.apache.org> wrote: > Something went wrong during the automatic credentials rotation for > IO-Datastores Cluster, performed at Thu Dec 01 18:58:27 UTC 2022. It may be > necessary to check the state of the cluster certificates. For further > details refer to the following links: > * https://ci-beam.apache.org/job/beam_SeedJob/ > * https://ci-beam.apache.org/.
Credentials Rotation Failure on IO-Datastores cluster
Something went wrong during the automatic credentials rotation for IO-Datastores Cluster, performed at Thu Dec 01 18:58:27 UTC 2022. It may be necessary to check the state of the cluster certificates. For further details refer to the following links: * https://ci-beam.apache.org/job/beam_SeedJob/ * https://ci-beam.apache.org/.
Re: Credentials Rotation Failure on IO-Datastores cluster
Does that have potential to break other things? We could presumably also update https://github.com/apache/beam/blob/4718cdff87fed4f92636e94dbf3a04c2315d6a95/.test-infra/jenkins/job_IODatastoresCredentialsRotation.groovy#L38 to pool-1 instead. I put up https://github.com/apache/beam/pull/24466 in case that is preferable. On Thu, Dec 1, 2022 at 1:29 PM Yi Hu wrote: > Thanks for reporting. I have bumped the pool size of io-datastore as we > have more tests being added and the default-pool frequently becomes > unschedulable due to memory constraints. A simple fix is just rename the > 'pool1' back to 'default-pool'. > > On Thu, Dec 1, 2022 at 1:26 PM Danny McCormick > wrote: > >> Yes, I was just starting to look into this. Looks like this is the result >> of this job failing - >> https://github.com/apache/beam/blob/ec2a07b38c1f640c62e7c3b96966f18b334a7ce9/.test-infra/jenkins/job_IODatastoresCredentialsRotation.groovy#L49 >> >> The error is: >> >> ``` >> >> *21:25:58* + gcloud container clusters upgrade io-datastores >> --node-pool=default-pool --zone=us-central1-a --quiet*21:25:59* ERROR: >> (gcloud.container.clusters.upgrade) No node pool found matching the name >> [default-pool]. >> >> ``` >> >> >> from >> https://ci-beam.apache.org/job/Rotate%20IO-Datastores%20Cluster%20Credentials/6/console >> >> >> It looks like there's been some change to the cluster that is causing the >> job to fail. If we don't fix this and rerun, the cluster's creds will >> expire (probably in like a monthish). I'm not sure what the impact of that >> would be, I think probably broken IO integration tests. 
>> >> @John Casey or @Yi Hu might >> know more about this, I think the cluster in question is >> https://pantheon.corp.google.com/kubernetes/clusters/details/us-central1-a/io-datastores/details?mods=dataflow_dev=apache-beam-testing >> >> Next steps are: >> 1) figuring out why there's no longer a default-pool >> 2) Either recreating it or modifying the cred rotation logic >> 3) (Minor) Fixing the url in the Jenkins job so it actually points to the >> failing job when we get emails like this >> >> On Thu, Dec 1, 2022 at 1:18 PM Byron Ellis via dev >> wrote: >> >>> Is there something we need to do here? >>> >>> On Thu, Dec 1, 2022 at 10:10 AM Apache Jenkins Server < >>> jenk...@builds.apache.org> wrote: >>> Something went wrong during the automatic credentials rotation for IO-Datastores Cluster, performed at Thu Dec 01 15:00:47 UTC 2022. It may be necessary to check the state of the cluster certificates. For further details refer to the following links: * https://ci-beam.apache.org/job/beam_SeedJob_Standalone/ * https://ci-beam.apache.org/. >>> >>>
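[Editor's note] For Danny's step 1 (figuring out why there's no longer a default-pool), the two gcloud calls involved might look like the following sketch. It assumes gcloud is authenticated against the test project; the pool name `pool-1` comes from the discussion in this thread and may not match the cluster's actual state.

```shell
# List the node pools on the io-datastores cluster to see what name the
# rotation job should target (the job currently hard-codes "default-pool"):
gcloud container clusters describe io-datastores \
  --zone=us-central1-a \
  --format="value(nodePools[].name)"

# Then rerun the credential-rotating upgrade against the pool that actually
# exists, e.g. the renamed pool mentioned above:
gcloud container clusters upgrade io-datastores \
  --node-pool=pool-1 --zone=us-central1-a --quiet
```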
Re: Credentials Rotation Failure on IO-Datastores cluster
Thanks for reporting. I have bumped the pool size of io-datastore as we have more tests being added and the default-pool frequently becomes unschedulable due to memory constraints. A simple fix is just rename the 'pool1' back to 'default-pool'. On Thu, Dec 1, 2022 at 1:26 PM Danny McCormick wrote: > Yes, I was just starting to look into this. Looks like this is the result > of this job failing - > https://github.com/apache/beam/blob/ec2a07b38c1f640c62e7c3b96966f18b334a7ce9/.test-infra/jenkins/job_IODatastoresCredentialsRotation.groovy#L49 > > The error is: > > ``` > > *21:25:58* + gcloud container clusters upgrade io-datastores > --node-pool=default-pool --zone=us-central1-a --quiet*21:25:59* ERROR: > (gcloud.container.clusters.upgrade) No node pool found matching the name > [default-pool]. > > ``` > > > from > https://ci-beam.apache.org/job/Rotate%20IO-Datastores%20Cluster%20Credentials/6/console > > > It looks like there's been some change to the cluster that is causing the > job to fail. If we don't fix this and rerun, the cluster's creds will > expire (probably in like a monthish). I'm not sure what the impact of that > would be, I think probably broken IO integration tests. > > @John Casey or @Yi Hu might > know more about this, I think the cluster in question is > https://pantheon.corp.google.com/kubernetes/clusters/details/us-central1-a/io-datastores/details?mods=dataflow_dev=apache-beam-testing > > Next steps are: > 1) figuring out why there's no longer a default-pool > 2) Either recreating it or modifying the cred rotation logic > 3) (Minor) Fixing the url in the Jenkins job so it actually points to the > failing job when we get emails like this > > On Thu, Dec 1, 2022 at 1:18 PM Byron Ellis via dev > wrote: > >> Is there something we need to do here? 
>> >> On Thu, Dec 1, 2022 at 10:10 AM Apache Jenkins Server < >> jenk...@builds.apache.org> wrote: >> >>> Something went wrong during the automatic credentials rotation for >>> IO-Datastores Cluster, performed at Thu Dec 01 15:00:47 UTC 2022. It may be >>> necessary to check the state of the cluster certificates. For further >>> details refer to the following links: >>> * https://ci-beam.apache.org/job/beam_SeedJob_Standalone/ >>> * https://ci-beam.apache.org/. >> >>
Re: Credentials Rotation Failure on IO-Datastores cluster
Yes, I was just starting to look into this. Looks like this is the result of this job failing - https://github.com/apache/beam/blob/ec2a07b38c1f640c62e7c3b96966f18b334a7ce9/.test-infra/jenkins/job_IODatastoresCredentialsRotation.groovy#L49 The error is: ``` *21:25:58* + gcloud container clusters upgrade io-datastores --node-pool=default-pool --zone=us-central1-a --quiet*21:25:59* ERROR: (gcloud.container.clusters.upgrade) No node pool found matching the name [default-pool]. ``` from https://ci-beam.apache.org/job/Rotate%20IO-Datastores%20Cluster%20Credentials/6/console It looks like there's been some change to the cluster that is causing the job to fail. If we don't fix this and rerun, the cluster's creds will expire (probably in like a monthish). I'm not sure what the impact of that would be, I think probably broken IO integration tests. @John Casey or @Yi Hu might know more about this, I think the cluster in question is https://pantheon.corp.google.com/kubernetes/clusters/details/us-central1-a/io-datastores/details?mods=dataflow_dev=apache-beam-testing Next steps are: 1) figuring out why there's no longer a default-pool 2) Either recreating it or modifying the cred rotation logic 3) (Minor) Fixing the url in the Jenkins job so it actually points to the failing job when we get emails like this On Thu, Dec 1, 2022 at 1:18 PM Byron Ellis via dev wrote: > Is there something we need to do here? > > On Thu, Dec 1, 2022 at 10:10 AM Apache Jenkins Server < > jenk...@builds.apache.org> wrote: > >> Something went wrong during the automatic credentials rotation for >> IO-Datastores Cluster, performed at Thu Dec 01 15:00:47 UTC 2022. It may be >> necessary to check the state of the cluster certificates. For further >> details refer to the following links: >> * https://ci-beam.apache.org/job/beam_SeedJob_Standalone/ >> * https://ci-beam.apache.org/. > >
Re: Credentials Rotation Failure on IO-Datastores cluster
Is there something we need to do here? On Thu, Dec 1, 2022 at 10:10 AM Apache Jenkins Server < jenk...@builds.apache.org> wrote: > Something went wrong during the automatic credentials rotation for > IO-Datastores Cluster, performed at Thu Dec 01 15:00:47 UTC 2022. It may be > necessary to check the state of the cluster certificates. For further > details refer to the following links: > * https://ci-beam.apache.org/job/beam_SeedJob_Standalone/ > * https://ci-beam.apache.org/.
Credentials Rotation Failure on IO-Datastores cluster
Something went wrong during the automatic credentials rotation for IO-Datastores Cluster, performed at Thu Dec 01 15:00:47 UTC 2022. It may be necessary to check the state of the cluster certificates. For further details refer to the following links: * https://ci-beam.apache.org/job/beam_SeedJob_Standalone/ * https://ci-beam.apache.org/.
Beam Dependency Check Report (2022-12-01)
[The report body was sent as text/html and could not be rendered in this archive.]
Re: [DISCUSSION][JAVA] Current state of Java 17 support
This is a good heads up, thank you Cristian. On Thu, Dec 1, 2022 at 8:13 AM Cristian Constantinescu wrote: > Hi, > > I came across some Kafka info and would like to share for those > unaware. Kafka is planning to drop support for Java 8 in Kafka 4 (Java > 8 is deprecated in Kafka 3), see KIP-750 [1]. > > I'm not sure when Kafka 4 is scheduled to be released (probably a few > years down the road), but when it happens, KafkaIO may not be able to > support it if we maintain Java 8 compatibility unless it remains on > Kafka 3. > > Anyways, if not already done, I think it's a good idea to start > putting up serious warning flags around Beam used with Java 8, even > for Google cloud customers ;) > > Cheers, > Cristian > > [1] https://issues.apache.org/jira/browse/KAFKA-12894 > > On Wed, Nov 30, 2022 at 12:59 PM Kenneth Knowles wrote: > > > > An important thing is to ensure that we do not accidentally depend on > something that would break Java 8 support. > > > > Currently our Java 11 and 17 tests build the code with Java 8 (just like > our released artifacts) and then compile and run the test code with the > newer JDK. This roughly matches the user scenario, I think. So it is a > little more complex than just having separate test runs for different JDK > versions. But it would be good to make this more symmetrical between JDK > versions to develop the mindset that JDK is always explicit. > > > > Kenn > > > > On Wed, Nov 30, 2022 at 9:48 AM Alexey Romanenko < > aromanenko@gmail.com> wrote: > >> > >> > >> On 30 Nov 2022, at 03:56, Tomo Suzuki via dev > wrote: > >> > >> > Do we still need to support Java 8 SDK? > >> > >> Yes, for Google Cloud customers who still use Java 8, I want Apache > Beam to support Java 8. Do you observe any special burden maintaining Java > 8? > >> > >> > >> I can only think of the additional resources costs if we will test all > supported JDKs, as Austin mentioned above. Imho, we should do that for all > JDK that are officially supported. 
> >> Another less-costly way is to run the Java tests for all JDKs only > during the release preparation stage. > >> > >> I agree that it would make sense to continue to support Java 8 until a > significant number of users are using it. > >> > >> — > >> Alexey > >> > >> > >> > >> Regards, > >> Tomo > >> > >> On Tue, Nov 29, 2022 at 21:48 Austin Bennett wrote: > >>> > >>> -1 for ongoing Java8 support [ or, said another way, +1 for dropping > support of Java8 ] > >>> > >>> +1 for having tests that run for ANY JDK that we say we support. Is > there any reason the resources to support are too costly [ or outweigh the > benefits of additional confidence in ensuring we support what we say we do > ]? I am not certain on whether this would only be critical for releases, > or should be done as part of regular CI. > >>> > >>> On Tue, Nov 29, 2022 at 8:51 AM Alexey Romanenko < > aromanenko@gmail.com> wrote: > > Hello, > > I’m sorry if it’s already discussed somewhere but I find myself a > little bit lost in the subject. > So, I’d like to clarify this - what is a current official state of > Java 17 support at Beam? > > I recall that a great job was done to make Beam compatible with Java > 17 [1] and Beam already provides “beam_java17_sdk” Docker image [2] but, > iiuc, Java 8 is still the default JVM to run all Java tests on Jenkins > ("Java PreCommit" in the first order) and there are only limited number of > tests that are running with JDK 11 and 17 on Jenkins by dedicated jobs. > > So, my question would sound like if Beam officially supports Java 17 > (and 11), do we need to run all Beam Java SDK related tests (VR and IT test > including) against all supported Java SDKs? > > Do we still need to support Java 8 SDK? > > In the same time, as we are heading to move everything from Jenkins > to GitHub actions, what would be the default JDK there or we will run all > Java-related actions against all supported JDKs? 
> > — > Alexey > > [1] https://issues.apache.org/jira/browse/BEAM-12240 > [2] https://hub.docker.com/r/apache/beam_java17_sdk > > > > >> -- > >> Regards, > >> Tomo > >> > >> >
Re: [DISCUSSION][JAVA] Current state of Java 17 support
Hi, I came across some Kafka info and would like to share for those unaware. Kafka is planning to drop support for Java 8 in Kafka 4 (Java 8 is deprecated in Kafka 3), see KIP-750 [1]. I'm not sure when Kafka 4 is scheduled to be released (probably a few years down the road), but when it happens, KafkaIO may not be able to support it if we maintain Java 8 compatibility unless it remains on Kafka 3. Anyways, if not already done, I think it's a good idea to start putting up serious warning flags around Beam used with Java 8, even for Google cloud customers ;) Cheers, Cristian [1] https://issues.apache.org/jira/browse/KAFKA-12894 On Wed, Nov 30, 2022 at 12:59 PM Kenneth Knowles wrote: > > An important thing is to ensure that we do not accidentally depend on > something that would break Java 8 support. > > Currently our Java 11 and 17 tests build the code with Java 8 (just like our > released artifacts) and then compile and run the test code with the newer > JDK. This roughly matches the user scenario, I think. So it is a little more > complex than just having separate test runs for different JDK versions. But > it would be good to make this more symmetrical between JDK versions to > develop the mindset that JDK is always explicit. > > Kenn > > On Wed, Nov 30, 2022 at 9:48 AM Alexey Romanenko > wrote: >> >> >> On 30 Nov 2022, at 03:56, Tomo Suzuki via dev wrote: >> >> > Do we still need to support Java 8 SDK? >> >> Yes, for Google Cloud customers who still use Java 8, I want Apache Beam to >> support Java 8. Do you observe any special burden maintaining Java 8? >> >> >> I can only think of the additional resources costs if we will test all >> supported JDKs, as Austin mentioned above. Imho, we should do that for all >> JDK that are officially supported. >> Another less-costly way is to run the Java tests for all JDKs only during >> the release preparation stage. 
>> >> I agree that it would make sense to continue to support Java 8 until a >> significant number of users are using it. >> >> — >> Alexey >> >> >> >> Regards, >> Tomo >> >> On Tue, Nov 29, 2022 at 21:48 Austin Bennett wrote: >>> >>> -1 for ongoing Java8 support [ or, said another way, +1 for dropping >>> support of Java8 ] >>> >>> +1 for having tests that run for ANY JDK that we say we support. Is there >>> any reason the resources to support are too costly [ or outweigh the >>> benefits of additional confidence in ensuring we support what we say we do >>> ]? I am not certain on whether this would only be critical for releases, >>> or should be done as part of regular CI. >>> >>> On Tue, Nov 29, 2022 at 8:51 AM Alexey Romanenko >>> wrote: Hello, I’m sorry if it’s already discussed somewhere but I find myself a little bit lost in the subject. So, I’d like to clarify this - what is a current official state of Java 17 support at Beam? I recall that a great job was done to make Beam compatible with Java 17 [1] and Beam already provides “beam_java17_sdk” Docker image [2] but, iiuc, Java 8 is still the default JVM to run all Java tests on Jenkins ("Java PreCommit" in the first order) and there are only limited number of tests that are running with JDK 11 and 17 on Jenkins by dedicated jobs. So, my question would sound like if Beam officially supports Java 17 (and 11), do we need to run all Beam Java SDK related tests (VR and IT test including) against all supported Java SDKs? Do we still need to support Java 8 SDK? In the same time, as we are heading to move everything from Jenkins to GitHub actions, what would be the default JDK there or we will run all Java-related actions against all supported JDKs? — Alexey [1] https://issues.apache.org/jira/browse/BEAM-12240 [2] https://hub.docker.com/r/apache/beam_java17_sdk >> -- >> Regards, >> Tomo >> >>
Beam High Priority Issue Report (62)
This is your daily summary of Beam's current high priority issues that may need attention. See https://beam.apache.org/contribute/issue-priorities for the meaning and expectations around issue priorities.

Unassigned P1 Issues:

https://github.com/apache/beam/issues/24389 [Failing Test]: HadoopFormatIOElasticTest.classMethod ExceptionInInitializerError ContainerFetchException
https://github.com/apache/beam/issues/24384 [Bug]: RampupThrottlingFnTest.testRampupThrottler TooManyActualInvocations
https://github.com/apache/beam/issues/24383 [Bug]: Daemon will be stopped at the end of the build after the daemon was no longer found in the daemon registry
https://github.com/apache/beam/issues/24374 [Bug]: Fail to retrieve rowcount for first arrow chunk: null.
https://github.com/apache/beam/issues/24367 [Bug]: workflow.tar.gz cannot be passed to flink runner
https://github.com/apache/beam/issues/24313 [Flaky]: apache_beam/runners/portability/portable_runner_test.py::PortableRunnerTestWithSubprocesses::test_pardo_state_with_custom_key_coder
https://github.com/apache/beam/issues/24267 [Failing Test]: Timeout waiting to lock gradle
https://github.com/apache/beam/issues/24263 [Bug]: Remote call on apache-beam-jenkins-3 failed. The channel is closing down or has closed down
https://github.com/apache/beam/issues/23944 beam_PreCommit_Python_Cron regularily failing - test_pardo_large_input flaky
https://github.com/apache/beam/issues/23745 [Bug]: Samza AsyncDoFnRunnerTest.testSimplePipeline is flaky
https://github.com/apache/beam/issues/23709 [Flake]: Spark batch flakes in ParDoLifecycleTest.testTeardownCalledAfterExceptionInProcessElement and ParDoLifecycleTest.testTeardownCalledAfterExceptionInStartBundle
https://github.com/apache/beam/issues/22969 Discrepancy in behavior of `DoFn.process()` when `yield` is combined with `return` statement, or vice versa
https://github.com/apache/beam/issues/22913 [Bug]: beam_PostCommit_Java_ValidatesRunner_Flink is flakes in org.apache.beam.sdk.transforms.GroupByKeyTest$BasicTests.testAfterProcessingTimeContinuationTriggerUsingState
https://github.com/apache/beam/issues/22321 PortableRunnerTestWithExternalEnv.test_pardo_large_input is regularly failing on jenkins
https://github.com/apache/beam/issues/21713 404s in BigQueryIO don't get output to Failed Inserts PCollection
https://github.com/apache/beam/issues/21561 ExternalPythonTransformTest.trivialPythonTransform flaky
https://github.com/apache/beam/issues/21480 flake: FlinkRunnerTest.testEnsureStdoutStdErrIsRestored
https://github.com/apache/beam/issues/21469 beam_PostCommit_XVR_Flink flaky: Connection refused
https://github.com/apache/beam/issues/21462 Flake in org.apache.beam.sdk.io.mqtt.MqttIOTest.testReadObject: Address already in use
https://github.com/apache/beam/issues/21261 org.apache.beam.runners.dataflow.worker.fn.logging.BeamFnLoggingServiceTest.testMultipleClientsFailingIsHandledGracefullyByServer is flaky
https://github.com/apache/beam/issues/21260 Python DirectRunner does not emit data at GC time
https://github.com/apache/beam/issues/21121 apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT.test_streaming_wordcount_it flakey
https://github.com/apache/beam/issues/21113 testTwoTimersSettingEachOtherWithCreateAsInputBounded flaky
https://github.com/apache/beam/issues/20976 apache_beam.runners.portability.flink_runner_test.FlinkRunnerTestOptimized.test_flink_metrics is flaky
https://github.com/apache/beam/issues/20975 org.apache.beam.runners.flink.ReadSourcePortableTest.testExecution[streaming: false] is flaky
https://github.com/apache/beam/issues/20974 Python GHA PreCommits flake with grpc.FutureTimeoutError on SDK harness startup
https://github.com/apache/beam/issues/20689 Kafka commitOffsetsInFinalize OOM on Flink
https://github.com/apache/beam/issues/20108 Python direct runner doesn't emit empty pane when it should
https://github.com/apache/beam/issues/19814 Flink streaming flakes in ParDoLifecycleTest.testTeardownCalledAfterExceptionInStartBundleStateful and ParDoLifecycleTest.testTeardownCalledAfterExceptionInProcessElementStateful
https://github.com/apache/beam/issues/19734 WatchTest.testMultiplePollsWithManyResults flake: Outputs must be in timestamp order (sickbayed)
https://github.com/apache/beam/issues/19465 Explore possibilities to lower in-use IP address quota footprint.
https://github.com/apache/beam/issues/19241 Python Dataflow integration tests should export the pipeline Job ID and console output to Jenkins Test Result section

P1 Issues with no update in the last week:

https://github.com/apache/beam/issues/24100 [Bug]: `Filter.whereFieldName` appears in docs but not available
https://github.com/apache/beam/issues/23906 [Bug]: Dataflow jpms tests fail on the 2.43.0 release branch
https://github.com/apache/beam/issues/23875 [Bug]: beam.Row.__eq__ returns true for unequal rows
https://github.com/apache/beam/issues/23525 [Bug]: Default