[ 
https://issues.apache.org/jira/browse/HADOOP-17066?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Steve Loughran resolved HADOOP-17066.
-------------------------------------
    Resolution: Duplicate

HADOOP-17318 covers the duplicate job ID problem everywhere in the committer; 
combined with a change in Spark, the problem should go away.

This is an intermittent issue: it depends on the timing with which stages are 
launched and, for the task staging directory conflict, on whether two task 
attempts of conflicting jobs are running at the same time.
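
To illustrate why the timing matters, here is a minimal, hypothetical Java 
sketch (not the actual committer code; all names are made up): if job IDs are 
derived from a second-granularity timestamp, two jobs launched within the same 
second resolve to the same staging directory, so their task attempts collide.

    import java.text.SimpleDateFormat;
    import java.util.Date;

    // Illustration only: how coarse, timestamp-based job IDs can collide.
    // If two jobs launch within the same second, both map to the same
    // staging directory, so their task attempts can overwrite or
    // double-commit each other's files.
    public class StagingDirCollision {

      // Hypothetical helper: derive a job ID from the launch time at
      // one-second granularity (the kind of scheme that conflicts).
      static String jobId(Date launchTime) {
        return new SimpleDateFormat("yyyyMMddHHmmss").format(launchTime)
            + "_0001";
      }

      // Hypothetical helper: the staging directory is keyed off the
      // job ID, so equal IDs mean a shared directory.
      static String stagingDir(String user, String id) {
        return "/tmp/hadoop/" + user + "/staging/" + id;
      }

      public static void main(String[] args) {
        Date now = new Date();
        String job1 = jobId(now);  // first job
        String job2 = jobId(now);  // second job, same second
        System.out.println(stagingDir("alice", job1));
        System.out.println(stagingDir("alice", job2)); // same path: conflict
      }
    }

A unique per-job UUID (as added under HADOOP-17318) removes this collision 
regardless of launch timing.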

> S3A staging committer committing duplicate files
> ------------------------------------------------
>
>                 Key: HADOOP-17066
>                 URL: https://issues.apache.org/jira/browse/HADOOP-17066
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs/s3
>    Affects Versions: 3.3.0, 3.2.1, 3.1.3
>            Reporter: Steve Loughran
>            Assignee: Steve Loughran
>            Priority: Major
>
> SPARK-39111 reports concurrent jobs double-writing files.



--
This message was sent by Atlassian Jira
(v8.3.4#803005)
