[https://issues.apache.org/jira/browse/BEAM-7300?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17122818#comment-17122818]
Beam JIRA Bot commented on BEAM-7300:
-------------------------------------
This issue is P2 but has been unassigned without any comment for 60 days, so it
has been labeled "stale-P2". If this issue is still affecting you, we care!
Please comment and remove the label. Otherwise, in 14 days the issue will be
moved to P3.
Please see https://beam.apache.org/contribute/jira-priorities/ for a detailed
explanation of what these priorities mean.
> Writing in batch to BigQuery may lead to data loss
> --------------------------------------------------
>
> Key: BEAM-7300
> URL: https://issues.apache.org/jira/browse/BEAM-7300
> Project: Beam
> Issue Type: Improvement
> Components: io-java-gcp
> Affects Versions: 2.12.0
> Reporter: Evgeny
> Priority: P2
> Labels: stale-P2
>
> There is no way to avoid data loss during batch insert into BigQuery when one
> of the data rows is corrupted.
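For context on the description above: in Beam's Java SDK, streaming inserts can surface rejected rows (e.g. via `WriteResult.getFailedInsertsWithErr()`), whereas batch file loads run as BigQuery load jobs that can fail or drop data when a row is malformed. One common mitigation, not necessarily the fix this issue proposes, is a "dead letter" split: validate rows before the write and route failures to a side collection for inspection or replay. Below is a minimal plain-Python sketch of that idea, with no Beam dependency; `is_corrupted` is a hypothetical stand-in for real row validation.

```python
def is_corrupted(row):
    # Hypothetical check: treat a row as corrupted if a required field
    # ("id" here) is missing. Real validation would match the table schema.
    return "id" not in row

def partition_rows(rows):
    """Split rows into (loadable, dead_letter) so the batch load only
    receives rows known to be valid; nothing is silently dropped."""
    loadable, dead_letter = [], []
    for row in rows:
        (dead_letter if is_corrupted(row) else loadable).append(row)
    return loadable, dead_letter

rows = [{"id": 1, "name": "a"}, {"name": "broken"}, {"id": 3, "name": "c"}]
good, bad = partition_rows(rows)
# `good` goes to the batch load; `bad` goes to a dead-letter sink
# (e.g. a file or a separate table) instead of being lost.
```

In a real pipeline the same split would be expressed with a multi-output transform so the dead-letter collection can be written to durable storage alongside the main load.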
--
This message was sent by Atlassian Jira
(v8.3.4#803005)