[ https://issues.apache.org/jira/browse/SPARK-36053?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17377359#comment-17377359 ]
Apache Spark commented on SPARK-36053:
--------------------------------------

User 'LuciferYang' has created a pull request for this issue:

https://github.com/apache/spark/pull/33267

> Unify write exception and delete abnormal disk block object file process
> ------------------------------------------------------------------------
>
>                 Key: SPARK-36053
>                 URL: https://issues.apache.org/jira/browse/SPARK-36053
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>    Affects Versions: 3.3.0
>            Reporter: Yang Jie
>            Priority: Minor
>
> There is duplicated code for cleaning up a failed file after
> `DiskBlockObjectWriter` writes data abnormally in
> `BypassMergeSortShuffleWriter`, `ExternalAppendOnlyMap`, and `ExternalSorter`.
> The duplicated code looks like this:
> {code:java}
> writer.revertPartialWritesAndClose()
> if (file.exists()) {
>   if (!file.delete()) {
>     logWarning(s"Error deleting ${file}")
>   }
> }
> {code}

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
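For illustration only, the duplicated "revert, then best-effort delete" pattern quoted above could be factored into one shared helper. The sketch below is hypothetical (the class and method names are not from Spark, and stderr stands in for Spark's `logWarning`); it only shows the unified cleanup shape in plain Java:

```java
import java.io.File;
import java.io.IOException;

public class AbnormalWriteCleanupSketch {

    // Hypothetical unified helper: best-effort deletion of a partially
    // written block file, warning (here: stderr) when deletion fails.
    // In Spark this would run after writer.revertPartialWritesAndClose().
    static void deleteFileQuietly(File file) {
        if (file.exists() && !file.delete()) {
            System.err.println("Error deleting " + file);
        }
    }

    public static void main(String[] args) throws IOException {
        File spill = File.createTempFile("spill", ".data");

        deleteFileQuietly(spill);            // file exists -> deleted
        System.out.println(spill.exists());  // prints: false

        deleteFileQuietly(spill);            // already gone -> silent no-op
        System.out.println("done");          // prints: done
    }
}
```

With a helper like this, each of the three call sites would shrink to a revert followed by a single cleanup call, instead of repeating the exists/delete/log sequence.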