Data Loss Bug in BigQuery IO Storage Write when used in Batch

2023-05-03 Thread John Casey via user
Hi All, Per https://github.com/apache/beam/issues/26521 and https://github.com/apache/beam/issues/26520, there is an issue in Beam versions 2.33 - 2.47 where data can be lost when using the Storage Write API in Batch. This issue is much more likely to occur in versions 2.44 - 2.47. The bugs
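Since the report pins the affected range to Beam 2.33 - 2.47 (with 2.44 - 2.47 at much higher risk), one quick way to flag a pipeline's SDK version is a small check like the sketch below. This is an illustration only, not part of Beam; the helper names and the exact boundary handling are assumptions based solely on the version numbers quoted above.

```python
def is_affected_version(version: str) -> bool:
    """Return True if a Beam version string falls in the 2.33 - 2.47
    range reported as affected by the Storage Write API batch data loss."""
    parts = version.split(".")
    major, minor = int(parts[0]), int(parts[1])
    return major == 2 and 33 <= minor <= 47


def is_high_risk(version: str) -> bool:
    """Return True for the 2.44 - 2.47 versions reported as much more
    likely to hit the bug."""
    parts = version.split(".")
    return int(parts[0]) == 2 and int(parts[1]) in range(44, 48)
```

For example, `is_affected_version("2.46.0")` is `True` while `is_affected_version("2.48.0")` is `False`; upgrading past the affected range sidesteps the issue described in the linked GitHub issues.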

Beam Summit Program is here!

2023-05-03 Thread Carolina Escobar
We’re very excited to announce the curated program we have prepared for you! Get to know our speakers! Take a quick peek at our program: - Beam at Talend - the long road from incubator project to cloud-based Pipeline Designer tool

Re: Losing records when using BigQuery IO Connector

2023-05-03 Thread XQ Hu via user
https://github.com/apache/beam/issues/26515 tracks this issue. The fix was merged. Thanks a lot for reporting this issue, Binh! On Mon, Apr 17, 2023 at 12:58 PM Binh Nguyen Van wrote: > Hi, > > I tested with streaming insert and file load, and they all worked as > expected. But looks like