[ https://issues.apache.org/jira/browse/SPARK-38976?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17527865#comment-17527865 ]
Hyukjin Kwon commented on SPARK-38976:
--------------------------------------

[~wesharn] I think it's best to discuss this on the dev mailing list first, and file an issue once it's confirmed.

> spark-sql. overwrite. hive table-duplicate records
> --------------------------------------------------
>
>                 Key: SPARK-38976
>                 URL: https://issues.apache.org/jira/browse/SPARK-38976
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.2.1
>            Reporter: wesharn
>            Priority: Major
>
> Duplicate records occurred when spark-sql overwrote a Hive table. The Spark job had failed stages, but would the DataFrame itself have any duplicate ids? When I ran the job again, the result was correct. It confused me. Why?

--
This message was sent by Atlassian Jira
(v8.20.7#820007)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
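
For context, a minimal sketch of the kind of overwrite job the report appears to describe. The table names src_table and target_table and the object name are assumptions for illustration only, not taken from the report; whether stage retries can leave duplicates in the committed output is exactly the question raised above.

    // Minimal sketch: overwrite a Hive table from Spark SQL.
    // Assumes a Hive metastore is configured and the tables already exist.
    import org.apache.spark.sql.SparkSession

    object OverwriteHiveTableSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("overwrite-hive-table-sketch")
          .enableHiveSupport()
          .getOrCreate()

        // Overwrite the target Hive table with the result of a query.
        // The reported symptom is duplicate rows in target_table after a run
        // that had failed (and retried) stages, even though the source query
        // itself has no duplicate ids.
        spark.sql(
          """
            |INSERT OVERWRITE TABLE target_table
            |SELECT id, name FROM src_table
            |""".stripMargin)

        spark.stop()
      }
    }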