pabloem commented on a change in pull request #11086: [BEAM-8910] Make custom BQ source read from Avro
URL: https://github.com/apache/beam/pull/11086#discussion_r409021250
 
 

 ##########
 File path: sdks/python/apache_beam/io/gcp/bigquery.py
 ##########
 @@ -1570,6 +1584,10 @@ class _ReadFromBigQuery(PTransform):
       bucket where the extracted table should be written as a string or
       a :class:`~apache_beam.options.value_provider.ValueProvider`. If
       :data:`None`, then the temp_location parameter is used.
+    backwards_compatible_data_format (bool): By default, this transform reads
+      data as native Python types converted from the Avro export of the
+      BigQuery table. Setting this flag to :data:`True` makes the transform
+      return JSON-mapped types instead, matching the older BigQuerySource.
 
 Review comment:
  I've added a link to https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-avro#avro_conversions for details on the conversions. I've also rephrased the doc. LMK what you think.
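
For context, here is a minimal sketch of the difference the flag describes. The rows below are hand-written illustrations (not actual Beam output, and the field names are hypothetical): the Avro export path surfaces native Python types, while the JSON path of the older BigQuerySource surfaced string representations.

```python
from datetime import datetime, timezone
from decimal import Decimal

# Hypothetical illustration of one BigQuery row under the two formats
# discussed in the docstring above (values are made up for the example).

# Avro export path (new default): native Python types.
# TIMESTAMP -> timezone-aware datetime, NUMERIC -> Decimal.
avro_row = {
    "created": datetime(2020, 4, 16, 12, 0, 0, tzinfo=timezone.utc),
    "price": Decimal("9.99"),
}

# JSON path (backwards_compatible_data_format=True, matching the
# older BigQuerySource): values arrive as strings.
json_row = {
    "created": "2020-04-16 12:00:00 UTC",
    "price": "9.99",
}

print(type(avro_row["created"]).__name__)  # datetime
print(type(json_row["created"]).__name__)  # str
```

Downstream DoFns that parsed these string values would need the flag (or updating) when switching to the Avro-based format.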

----------------------------------------------------------------
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.