[ https://issues.apache.org/jira/browse/BEAM-14273?focusedWorklogId=762464&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-762464 ]

ASF GitHub Bot logged work on BEAM-14273:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 26/Apr/22 18:11
            Start Date: 26/Apr/22 18:11
    Worklog Time Spent: 10m 
      Work Description: pabloem commented on code in PR #17431:
URL: https://github.com/apache/beam/pull/17431#discussion_r859011889


##########
sdks/python/apache_beam/io/gcp/bigquery.py:
##########
@@ -2190,6 +2190,12 @@ def expand(self, pcoll):
               'A schema must be provided when writing to BigQuery using '
               'Avro based file loads')
 
+      if self.schema and 'JSON' in str(self.schema):

Review Comment:
   can you add a unit test for this particular behavior please?
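   One shape such a unit test might take (a sketch only: `_check_avro_json_compat` is a hypothetical stand-in for the guard added in the diff, and the exact exception type and message used in the PR may differ):
   
   ```python
   import unittest
   
   
   def _check_avro_json_compat(schema):
     # Hypothetical stand-in for the guard in the diff above:
     # Avro-based file loads cannot write BigQuery JSON-typed columns,
     # so reject any schema that declares a JSON field.
     if schema and 'JSON' in str(schema):
       raise ValueError(
           'JSON type is not supported with Avro based file loads')
   
   
   class AvroJsonSchemaTest(unittest.TestCase):
     def test_json_field_rejected(self):
       schema = {'fields': [{'name': 'data', 'type': 'JSON'}]}
       with self.assertRaises(ValueError):
         _check_avro_json_compat(schema)
   
     def test_non_json_schema_accepted(self):
       schema = {'fields': [{'name': 'data', 'type': 'STRING'}]}
       # Should not raise.
       _check_avro_json_compat(schema)
   
   
   if __name__ == '__main__':
     unittest.main()
   ```
   
   In the real test this would go through `WriteToBigQuery` with `temp_file_format=FileFormat.AVRO` rather than a free function, but the two cases (JSON field rejected, non-JSON schema accepted) are the behavior being asked about.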





Issue Time Tracking
-------------------

    Worklog Id:     (was: 762464)
    Time Spent: 3h  (was: 2h 50m)

> Update BQ connector to support new JSON type (Python)
> -----------------------------------------------------
>
>                 Key: BEAM-14273
>                 URL: https://issues.apache.org/jira/browse/BEAM-14273
>             Project: Beam
>          Issue Type: New Feature
>          Components: io-py-gcp
>            Reporter: Ahmed Abualsaud
>            Assignee: Ahmed Abualsaud
>            Priority: P2
>          Time Spent: 3h
>  Remaining Estimate: 0h
>
> BQ has a new JSON type that is defined here: 
> [https://cloud.google.com/bigquery/docs/reference/standard-sql/data-types#json_type]
> We should update Beam BQ Java and Python connectors to support that for 
> various read methods (export jobs, storage API) and write methods (load jobs, 
> streaming inserts, storage API).
> We should also add integration tests that exercise reading from/writing to 
> BQ tables with columns that have JSON type.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
