Streaming inserts are not really designed to be used from batch pipelines. From batch, you will definitely overwhelm BQ's streaming quota, causing all sorts of problems.
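
For reference, here is a minimal sketch of the batch-friendly configuration with the Java SDK (the project, dataset, table, and schema below are placeholders, not taken from the user's pipeline): with bounded input you can leave the method at its default, or set FILE_LOADS explicitly, instead of forcing STREAMING_INSERTS.

import com.google.api.services.bigquery.model.TableFieldSchema;
import com.google.api.services.bigquery.model.TableRow;
import com.google.api.services.bigquery.model.TableSchema;
import java.util.Collections;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.TableRowJsonCoder;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.Create;

public class BatchBigQueryWrite {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    // Placeholder schema with a single STRING column.
    TableSchema schema = new TableSchema().setFields(Collections.singletonList(
        new TableFieldSchema().setName("value").setType("STRING")));

    p.apply(Create.of(new TableRow().set("value", "example"))
            .withCoder(TableRowJsonCoder.of()))
        .apply(
            BigQueryIO.writeTableRows()
                .to("my-project:my_dataset.my_table")  // placeholder table spec
                .withSchema(schema)
                // FILE_LOADS is the default for bounded (batch) input; setting it
                // explicitly avoids pushing batch traffic through streaming inserts
                // and their per-table quota.
                .withMethod(BigQueryIO.Write.Method.FILE_LOADS)
                .withCreateDisposition(CreateDisposition.CREATE_IF_NEEDED)
                .withWriteDisposition(WriteDisposition.WRITE_APPEND));

    p.run();
  }
}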
On Mon, Jul 1, 2019 at 10:54 AM Mikhail Gryzykhin <[email protected]> wrote:
> Hello everybody,
>
> This question is regarding a user post on StackOverflow
> <https://stackoverflow.com/questions/56823629/gcp-dataflow-running-streaming-inserts-into-bigquery-gc-thrashing>.
>
> My understanding of the problem is that setting .withMethod(STREAMING_INSERTS)
> on the BigQueryIO sink causes GC thrashing on a large number of entries.
>
> Is there a known issue, or information on how to start triaging this?
>
> A search on Jira showed me this ticket, but it is not directly connected with
> the issue: https://issues.apache.org/jira/browse/BEAM-7666
>
> Thank you,
> Mikhail.
