ng-oliver opened a new issue, #35329: URL: https://github.com/apache/beam/issues/35329
https://github.com/apache/beam/blob/297ba99e96ed65ee154a8c4327023dd9a3c31f98/sdks/python/apache_beam/io/gcp/bigquery.py#L2555

Currently one can configure table creation options (say, clustering and time partitioning) for tables created under `create_disposition=CREATE_IF_NEEDED` when the method is `STREAMING_INSERTS` or `FILE_LOADS`, via the `additional_bq_parameters` argument. However, this argument has no equivalent when method = `STORAGE_WRITE_API`.

This leads to the awkward situation where the Dataflow job can write to dynamic destinations (which is great), but none of those destinations are partitioned or clustered when the tables are created (which is not great). One then has to manually delete the created tables and run DDL statements to re-create them with partitioning and clustering, which defeats the benefit of letting Dataflow create the destinations dynamically.
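For illustration, a minimal sketch of the working path (the project, dataset, schema, and field names are made up for the example): with `FILE_LOADS`, `additional_bq_parameters` lets the dynamically created tables come out partitioned and clustered, but swapping the method to `STORAGE_WRITE_API` leaves no way to pass these options.

```python
import apache_beam as beam

def write_events(rows):
    # Works today: FILE_LOADS (or STREAMING_INSERTS) accepts table creation
    # options such as time partitioning and clustering through
    # additional_bq_parameters, so each dynamically created destination
    # table is partitioned and clustered on creation.
    return rows | "WriteViaFileLoads" >> beam.io.WriteToBigQuery(
        # Hypothetical dynamic destination: route each row to its own table.
        table=lambda row: f"my-project:my_dataset.events_{row['tenant']}",
        schema="tenant:STRING,user_id:STRING,event_timestamp:TIMESTAMP",
        method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        additional_bq_parameters={
            "timePartitioning": {"type": "DAY", "field": "event_timestamp"},
            "clustering": {"fields": ["user_id"]},
        },
    )
    # Not possible today: changing method to
    # beam.io.WriteToBigQuery.Method.STORAGE_WRITE_API offers no
    # additional_bq_parameters equivalent, so the same dynamically created
    # tables end up neither partitioned nor clustered.
```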