[ https://issues.apache.org/jira/browse/BEAM-5510?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Ahmet Altay reassigned BEAM-5510:
---------------------------------

    Assignee: Chamikara Jayalath  (was: Ahmet Altay)

> Records including datetime to be saved as DATETIME or TIMESTAMP in BigQuery
> ---------------------------------------------------------------------------
>
>                 Key: BEAM-5510
>                 URL: https://issues.apache.org/jira/browse/BEAM-5510
>             Project: Beam
>          Issue Type: Bug
>          Components: sdk-py-core
>    Affects Versions: 2.6.0
>            Reporter: Pascal Gula
>            Assignee: Chamikara Jayalath
>            Priority: Major
>
> When trying to write a row to BigQuery that includes a Python datetime object, the JSON marshaling used to serialize the row fails:
> {code:python}
> File "/home/pascal/Wks/GitHub/PEAT-AI/Albatros/venv/local/lib/python2.7/site-packages/apache_beam/internal/gcp/json_value.py", line 124, in to_json_value
>     raise TypeError('Cannot convert %s to a JSON value.' % repr(obj))
> TypeError: Cannot convert datetime.datetime(2018, 9, 25, 18, 57, 18, 108579) to a JSON value. [while running 'save/WriteToBigQuery']
> {code}
> However, this is perfectly feasible: `google-cloud-python` has supported it since this issue was resolved:
> [https://github.com/GoogleCloudPlatform/google-cloud-python/issues/2957]
> thanks to this pull request:
> [https://github.com/GoogleCloudPlatform/google-cloud-python/pull/3426/files]
> A similar approach could be taken in the `json_value.py` helper.
> Is there any workaround that can be applied to solve this issue?

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
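[Editor's note] Pending a fix in `json_value.py`, one common workaround is to convert datetime values to strings before the rows reach `WriteToBigQuery`, since BigQuery accepts ISO 8601 / RFC 3339 strings for TIMESTAMP columns. The sketch below is illustrative only (the function name `sanitize_row` is not part of Beam, and applying it via `beam.Map` ahead of the sink is an assumption about the reporter's pipeline, not a confirmed fix from this ticket):

```python
import datetime

def sanitize_row(row):
    """Return a copy of `row` with datetime values replaced by ISO 8601 strings.

    Strings are JSON-serializable, so they pass through Beam's
    to_json_value helper without raising TypeError.
    """
    out = {}
    for key, value in row.items():
        if isinstance(value, datetime.datetime):
            # BigQuery parses ISO 8601 strings into TIMESTAMP values.
            out[key] = value.isoformat()
        else:
            out[key] = value
    return out

row = {'event': 'save',
       'created_at': datetime.datetime(2018, 9, 25, 18, 57, 18, 108579)}
print(sanitize_row(row)['created_at'])  # '2018-09-25T18:57:18.108579'
```

In a pipeline this would typically be applied as a `beam.Map(sanitize_row)` step immediately before the `WriteToBigQuery` transform.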