Github user rezasafi commented on the issue: https://github.com/apache/spark/pull/19388 Sorry for the delay. It seems that, to be able to commit the same RDD in different stages, we need to use the stage ID. So the jobId and other configuration set up in the write method of SparkHadoopWriter should be based on the stageId of the running task, not the rddId. I have a hacky solution for this, but I am working on a better one and will update this PR ASAP.
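A minimal sketch of the point (not the actual SparkHadoopWriter change; the object and app names here are made up for illustration): inside a task, TaskContext exposes stageId(), which differs between two stages that process the same RDD, whereas rdd.id stays constant, so keying the commit protocol on rddId alone can collide across stages:

    import org.apache.spark.{SparkConf, SparkContext, TaskContext}

    object StageIdDemo {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(
          new SparkConf().setAppName("stage-id-demo").setMaster("local[2]"))
        val rdd = sc.parallelize(1 to 10, numSlices = 2)

        // Run two separate jobs over the same RDD. Each action executes in
        // a fresh stage, so TaskContext.stageId() differs between the two
        // runs, while rdd.id is identical in both.
        for (_ <- 1 to 2) {
          val ids = rdd.mapPartitions { iter =>
            val ctx = TaskContext.get()
            Iterator.single((ctx.stageId(), ctx.partitionId(), iter.size))
          }.collect()
          ids.foreach(println)
        }
        sc.stop()
      }
    }

Basing the commit identifiers on something stage-scoped, as suggested above, would keep two commits of the same RDD from stepping on each other.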