[ 
https://issues.apache.org/jira/browse/SPARK-13601?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15174433#comment-15174433
 ] 

Apache Spark commented on SPARK-13601:
--------------------------------------

User 'davies' has created a pull request for this issue:
https://github.com/apache/spark/pull/11450

> Invoke task failure callbacks before calling OutputStream.close()
> -----------------------------------------------------------------
>
>                 Key: SPARK-13601
>                 URL: https://issues.apache.org/jira/browse/SPARK-13601
>             Project: Spark
>          Issue Type: Improvement
>          Components: Spark Core
>            Reporter: Davies Liu
>            Assignee: Davies Liu
>
> We need to submit another PR against Spark that invokes the task failure 
> callbacks before Spark calls the close function on various output streams.
> For example, we need to intercept the exception and call 
> TaskContext.markTaskFailed before calling close in the following code (in 
> PairRDDFunctions.scala):
> {code}
>       Utils.tryWithSafeFinally {
>         while (iter.hasNext) {
>           val record = iter.next()
>           writer.write(record._1.asInstanceOf[AnyRef], 
> record._2.asInstanceOf[AnyRef])
>           // Update bytes written metric every few records
>           maybeUpdateOutputMetrics(outputMetricsAndBytesWrittenCallback, 
> recordsWritten)
>           recordsWritten += 1
>         }
>       } {
>         writer.close()
>       }
> {code}
> Changes to Spark should include unit tests to make sure this always works in 
> the future.
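> A minimal sketch of the control flow being proposed: a try/finally-style 
> helper that runs a failure callback (standing in for 
> TaskContext.markTaskFailed) before the finally block closes the writer, so 
> the callback observes the original exception before close() can run or mask 
> it. The helper name and the callback parameter here are hypothetical, not 
> the actual Spark patch in the PR above.
> {code}
> object TaskFailureSketch {
>   // Runs `block`; if it throws, invokes `onFailure` with the exception
>   // BEFORE running `finallyBlock` (e.g. writer.close()), then rethrows.
>   def tryWithFailureCallback[T](block: => T)
>                                (onFailure: Throwable => Unit)
>                                (finallyBlock: => Unit): T = {
>     try {
>       block
>     } catch {
>       case t: Throwable =>
>         // Report the task failure first; swallow callback errors so the
>         // original exception is the one that propagates.
>         try onFailure(t) catch { case _: Throwable => () }
>         throw t
>     } finally {
>       finallyBlock
>     }
>   }
> }
>
> // Usage: record the order of events for a failing "task".
> val events = scala.collection.mutable.ArrayBuffer[String]()
> try {
>   TaskFailureSketch.tryWithFailureCallback {
>     events += "write"
>     throw new RuntimeException("task failure")
>   } (_ => events += "failureCallback") (events += "close")
> } catch {
>   case _: RuntimeException => () // expected
> }
> // events now contains: write, failureCallback, close
> {code}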



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
