Repository: spark
Updated Branches:
  refs/heads/master e678b9f02 -> c35186206


[SPARK-3935][Core] log the number of records that have been written

There is an unused variable (count) in saveAsHadoopDataset in 
PairRDDFunctions.scala. The initial idea behind this variable seems to be 
counting the number of records, so I am adding a log statement to report the 
number of records that have been written to the writer.
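As a hypothetical sketch (not the actual PairRDDFunctions code), the intended counting pattern can be illustrated with a standalone helper; the name countWritten and the write callback are illustrative assumptions, not part of the patch:

```scala
// Sketch only: count records as they are handed to a writer, so the
// total can be logged afterwards (e.g. with logDebug).
def countWritten[T](iter: Iterator[T], write: T => Unit = (_: T) => ()): Long = {
  var count = 0L
  while (iter.hasNext) {
    val record = iter.next()
    write(record)  // pass the record to the underlying writer
    count += 1     // track how many records have been written
  }
  count            // caller logs this total once the partition is done
}
```

The key point is that the counter is only useful if something consumes it; in the original code nothing logged it, which is why the variable was flagged as unused.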

Author: likun <jacky.li...@huawei.com>
Author: jackylk <jacky.li...@huawei.com>

Closes #2791 from jackylk/SPARK-3935 and squashes the following commits:

a874047 [jackylk] removing the unused variable in PairRddFunctions.scala
3bf43c7 [likun] log the number of records has been written


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/c3518620
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/c3518620
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/c3518620

Branch: refs/heads/master
Commit: c351862064ed7d2031ea4c8bf33881e5f702ea0a
Parents: e678b9f
Author: likun <jacky.li...@huawei.com>
Authored: Fri Oct 17 10:33:45 2014 -0700
Committer: Andrew Or <andrewo...@gmail.com>
Committed: Fri Oct 17 10:33:45 2014 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala | 2 --
 1 file changed, 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/c3518620/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala b/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
index 929ded5..ac96de8 100644
--- a/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
+++ b/core/src/main/scala/org/apache/spark/rdd/PairRDDFunctions.scala
@@ -1032,10 +1032,8 @@ class PairRDDFunctions[K, V](self: RDD[(K, V)])
       writer.setup(context.stageId, context.partitionId, attemptNumber)
       writer.open()
       try {
           writer.write(record._1.asInstanceOf[AnyRef], 
record._2.asInstanceOf[AnyRef])

