> https://cloud.google.com/bigquery/streaming-data-into-bigquery#streaming_into_partitioned_tables
From: Reuven Lax
Reply-To: "dev@beam.apache.org"
Date: Wednesday, 14 November 2018 at 14:51
To: "dev@beam.apache.org"
Subject: Re: Bigquery streaming TableRow size limit
Generally I would agree, but the consequences here of a mistake are severe.
Not only will the beam pipeline get stuck for 24 hours, _anything_ else in
the user's GCP project that tries to load data into BigQuery will also fail
for the next 24 hours. Given the severity, I think it's best to make
I would rather not have the builder method and run into the quota issue
than require the builder method and still run into quota issues.
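The "builder method" being debated could look something like the sketch below. This is purely hypothetical: neither the class nor the method name exists in Beam's BigQueryIO, and the names are invented to illustrate the opt-in API shape under discussion.

```python
# Hypothetical sketch of an opt-in builder method for routing oversized
# rows to load jobs. Class and method names are invented for illustration;
# this is NOT the actual Beam BigQueryIO API.
class HypotheticalBigQueryWrite:
    def __init__(self):
        # Fallback is off by default, so users must opt in explicitly
        # and accept the load-job quota implications.
        self.load_job_fallback = False

    def with_load_job_fallback(self):
        """Opt in to sending rows over the streaming size limit via load jobs."""
        self.load_job_fallback = True
        return self  # builder style: return self for chaining


write = HypotheticalBigQueryWrite().with_load_job_fallback()
```

The point of the opt-in is that the user has consciously accepted the risk to their project-wide load-job quota, rather than having the fallback silently enabled.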
On Mon, Nov 12, 2018 at 5:25 PM Reuven Lax wrote:
I'm a bit worried about making this automatic, as it can have unexpected
side effects on the BigQuery load-job quota. This is a 24-hour quota, so if
it's accidentally exceeded, all load jobs for the project may be blocked for
the next 24 hours. However, if the user opts in (possibly via a builder
Having data ingestion work without needing to worry about how big the blobs
are would be nice if it were automatic for users.
On Mon, Nov 12, 2018 at 1:03 AM Wout Scheepers <
wout.scheep...@vente-exclusive.com> wrote:
Hey all,
The TableRow size limit is 1 MB when streaming into BigQuery.
To prevent data loss, I’m going to implement a TableRow size check and add a
fan-out to do a BigQuery load job in case the size is above the limit.
Of course this load job would be windowed.
I know it doesn’t make sense to
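The size check and fan-out described above could be sketched as follows. This is an illustrative sketch, not actual BigQueryIO code: the routing function and the 1 MB constant (taken from the limit mentioned in the thread) are assumptions, and a real implementation would measure the size the same way the BigQuery API does.

```python
import json

# Streaming inserts reject rows above the size limit (1 MB per the thread),
# so oversized rows are routed to a separate (windowed) load-job path.
# Illustrative sketch only; not actual Beam BigQueryIO code.
MAX_STREAMING_ROW_BYTES = 1_000_000


def route_row(row: dict) -> str:
    """Return 'streaming' for rows under the limit, 'load_job' otherwise.

    The JSON-serialized byte length is used here as a stand-in for however
    BigQuery actually measures row size.
    """
    size = len(json.dumps(row).encode("utf-8"))
    return "streaming" if size <= MAX_STREAMING_ROW_BYTES else "load_job"


small_row = {"id": 1, "payload": "x" * 100}
big_row = {"id": 2, "payload": "x" * 2_000_000}
print(route_row(small_row))  # streaming
print(route_row(big_row))    # load_job
```

In a Beam pipeline this routing would typically be a multi-output `DoFn` (or `Partition`), with the load-job branch windowed as Wout describes so that batches are flushed periodically.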