Hi,

We run multiple streaming pipelines on Cloud Dataflow that read from Kafka 
and write to BigQuery. We can tolerate a few hours' delay and would like to 
avoid the costs associated with streaming data into BigQuery. Is there 
already support (or a future plan) for such a scenario? If not, I guess I 
will implement one of the following options:
* A BoundedSource implementation for Kafka so that we can run this in batch 
mode.
* The streaming job writes files to GCS, and a periodic BigQuery load job 
then loads those files into BigQuery.
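For the first option, the core idea would be to fix an offset range per Kafka partition when the batch job starts, then split that range across workers so the source is bounded. A rough, self-contained sketch of the splitting step (the function name and shapes here are illustrative, not a real Beam or Kafka API):

```python
def split_offset_range(start, end, desired_splits):
    """Split the half-open Kafka offset range [start, end) into contiguous
    sub-ranges, one per bundle, covering every offset exactly once."""
    total = end - start
    splits = min(desired_splits, total) or 1
    base, extra = divmod(total, splits)
    ranges = []
    lo = start
    for i in range(splits):
        # The first `extra` sub-ranges get one extra offset each.
        hi = lo + base + (1 if i < extra else 0)
        ranges.append((lo, hi))
        lo = hi
    return ranges

# e.g. split offsets [0, 10) across 3 workers:
# split_offset_range(0, 10, 3) -> [(0, 4), (4, 7), (7, 10)]
```

A BoundedSource built this way would capture the latest offsets at job-submission time and treat each sub-range as one split, so re-running the batch job periodically picks up new data.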

Thanks!