Repository: spark

Updated Branches:
  refs/heads/master 0cf59fcbe -> 7696b9de0
[SPARK-20538][SQL] Wrap Dataset.reduce with withNewRddExecutionId.

## What changes were proposed in this pull request?

Wrap Dataset.reduce with `withNewExecutionId`.

Author: Soham Aurangabadkar <soha...@gmail.com>

Closes #21316 from sohama4/dataset_reduce_withexecutionid.

Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/7696b9de
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/7696b9de
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/7696b9de

Branch: refs/heads/master
Commit: 7696b9de0df6e9eb85a74bdb404409da693cf65e
Parents: 0cf59fc
Author: Soham Aurangabadkar <soha...@gmail.com>
Authored: Fri May 18 10:29:34 2018 -0700
Committer: Shixiong Zhu <zsxw...@gmail.com>
Committed: Fri May 18 10:29:34 2018 -0700

----------------------------------------------------------------------
 sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala | 4 +++-
 1 file changed, 3 insertions(+), 1 deletion(-)
----------------------------------------------------------------------

http://git-wip-us.apache.org/repos/asf/spark/blob/7696b9de/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
----------------------------------------------------------------------
diff --git a/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala b/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
index f001f16..32267eb 100644
--- a/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
+++ b/sql/core/src/main/scala/org/apache/spark/sql/Dataset.scala
@@ -1617,7 +1617,9 @@ class Dataset[T] private[sql](
    */
   @Experimental
   @InterfaceStability.Evolving
-  def reduce(func: (T, T) => T): T = rdd.reduce(func)
+  def reduce(func: (T, T) => T): T = withNewRDDExecutionId {
+    rdd.reduce(func)
+  }
 
   /**
    * :: Experimental ::

---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org
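For context, a minimal usage sketch (not part of the commit; the app name and local master are assumptions for illustration). The user-facing call site is unchanged by this patch; the difference is that the `reduce` action now runs under an execution id, so it is tracked like other SQL executions:

```scala
import org.apache.spark.sql.SparkSession

object ReduceExample {
  def main(args: Array[String]): Unit = {
    // Assumption: local mode, purely for illustration.
    val spark = SparkSession.builder()
      .appName("ReduceExample")
      .master("local[*]")
      .getOrCreate()
    import spark.implicits._

    val ds = Seq(1, 2, 3, 4).toDS()
    // Same API as before the patch; internally now wrapped in
    // withNewRDDExecutionId per the diff above.
    val sum = ds.reduce(_ + _)
    println(sum)  // 10

    spark.stop()
  }
}
```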