I think you need a rather peculiar reason to force streaming inserts in a
batch job. Note that in batch mode you will quickly hit the streaming quota
limit: "Maximum rows per second: 100,000 rows per second, per project",
whereas a batch load can process far more data in a shorter time.

I know you can force batch loads in streaming mode; I don't know about the
other way around.
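
That said, the Java SDK does let you choose the write method explicitly via
`BigQueryIO.Write.withMethod`. A minimal sketch (the project, dataset, and
table names are hypothetical, and `rows` is assumed to be an existing
`PCollection<TableRow>`):

```java
import com.google.api.services.bigquery.model.TableRow;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.CreateDisposition;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.Method;
import org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO.Write.WriteDisposition;
import org.apache.beam.sdk.values.PCollection;

// Force streaming inserts even on a bounded (batch) input, where the
// default method would otherwise be FILE_LOADS (a BQ load job).
PCollection<TableRow> rows = /* ... your rows ... */ null;
rows.apply(
    BigQueryIO.writeTableRows()
        .to("my-project:my_dataset.my_table")      // hypothetical table
        .withMethod(Method.STREAMING_INSERTS)      // override the default
        .withCreateDisposition(CreateDisposition.CREATE_NEVER)
        .withWriteDisposition(WriteDisposition.WRITE_APPEND));
```

Just keep the quota above in mind before doing this at batch volumes.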

_/
_/ Alex Van Boxel


On Tue, May 7, 2019 at 6:58 PM Andres Angel <ingenieroandresan...@gmail.com>
wrote:

> Hello everyone,
>
> I need to use BigQuery inserts within my Beam pipeline. I know the
> built-in IO options offer `BigQueryIO`, however by default this inserts
> into BQ in a batch fashion, creating a BQ load job underneath. I instead
> need to trigger a streaming insert into BQ, and I was reviewing the Java
> SDK documentation, but it seems this is not possible.
>
> On the other hand, in the Python SDK I found this GitHub
> documentation
> <https://beam.apache.org/releases/pydoc/2.4.0/_modules/apache_beam/io/gcp/bigquery.html>
> code that uses a method *InsertAll
> <https://github.com/apache/beam/blob/master/sdks/python/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py>*,
> which is apparently what I need. If this is official, I would like to
> know whether there is a native way to trigger streaming inserts into BQ
> using the Java SDK.
>
> Thanks so much for your feedback,
> AU
>
