I like the approach, and the doc! I've added some Go-specific nits and questions.
I'm lifting one question here for information gathering:
Do streaming SDFs in Java/Python require event times to be emitted with every
element, or does it default to past the end of the GlobalWindow?
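For concreteness, here is a minimal Python sketch of what "emitting an event time with every element" looks like; this is an illustrative plain DoFn rather than a full streaming SDF, and the record shape with an 'event_time' field is an assumption on my part, not how any particular SDF does it.

import apache_beam as beam
from apache_beam.transforms.window import TimestampedValue
from apache_beam.utils.timestamp import Timestamp

class EmitWithEventTime(beam.DoFn):
    """Attaches an explicit event time to each output element (illustrative)."""

    def process(self, record):
        # Wrapping the output in TimestampedValue sets its event time;
        # without it, the output keeps the input element's timestamp.
        yield TimestampedValue(record, Timestamp.of(record["event_time"]))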
On Thu, Mar 31,
Hello guys,
I am trying to find an example of how to define a deterministic coder for a
namedtuple but have failed to find any. In this official doc
https://beam.apache.org/documentation/sdks/python-type-safety/#kinds-of-type-hints,
it says a code example is included for the PlayerCoder, but there is actually none.
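For anyone else searching the archives: here is a rough sketch of what such a deterministic coder and its registration could look like in the Python SDK. The Player fields and the JSON-based encoding below are my own assumptions, not the missing PlayerCoder example from the docs.

import json
from typing import NamedTuple

import apache_beam as beam

class Player(NamedTuple):
    name: str
    score: int

class PlayerCoder(beam.coders.Coder):
    """A deterministic coder for the Player namedtuple (illustrative)."""

    def encode(self, player):
        # Encode the fields in a fixed order so equal values always
        # produce identical bytes (needed when Player is used as a key).
        return json.dumps([player.name, player.score]).encode("utf-8")

    def decode(self, encoded):
        name, score = json.loads(encoded.decode("utf-8"))
        return Player(name, score)

    def is_deterministic(self):
        return True

# Tell Beam to use this coder for any PCollection of Player values.
beam.coders.registry.register_coder(Player, PlayerCoder)

With is_deterministic() returning True, Player values can be used as grouping keys, which is typically why a deterministic coder is needed in the first place.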
This is your daily summary of Beam's current flaky tests
(https://issues.apache.org/jira/issues/?jql=project%20%3D%20BEAM%20AND%20statusCategory%20!%3D%20Done%20AND%20labels%20%3D%20flake)
These are P1 issues because they have a major negative impact on the community
and make it hard to determin
This is your daily summary of Beam's current P1 issues, not including flaky
tests
(https://issues.apache.org/jira/issues/?jql=project%20%3D%20BEAM%20AND%20statusCategory%20!%3D%20Done%20AND%20priority%20%3D%20P1%20AND%20(labels%20is%20EMPTY%20OR%20labels%20!%3D%20flake).
See https://beam.apache.
> On 31 Mar 2022, at 18:02, Robert Bradshaw wrote:
>
> Generally makes sense to me, though I'm curious what the maintenance
> burden is (high or low) in keeping it around.
Well, we need to provide two versions of Spark runner artifacts, job servers
and Docker images, and to test them separately (
Hi, community!
Our team is working on the new CdapIO connector implementation, and we have prepared
initial PRs for components of the CdapIO package. We are reviewing the design
document [1] with members of CDAP, and working towards resolving comments and
having CDAP plugin artifacts published to Maven Central
+1 from me to drop Spark 2.x support.
Users who still want to use Spark 2.x can stay on a previous Beam release.
Regards
JB
On Thu, Mar 31, 2022 at 5:51 PM Alexey Romanenko
wrote:
>
> Hi everyone,
>
> For the moment, Beam Spark Runner supports two versions of Spark - 2.x and
> 3.x.
>
> Takin
Hi everyone,
For the moment, Beam Spark Runner supports two versions of Spark - 2.x and 3.x.
Taking into account several things:
- almost all cloud providers have already mostly moved to Spark 3.x as the main
supported version;
- the latest Spark 2.x release (Spark 2.4.8, maintenance release)