Repository: spark
Updated Branches:
  refs/heads/master 9baf09301 -> fe08561e2


[SPARK-8476] [CORE] Setters inc/decDiskBytesSpilled in TaskMetrics should also be private.

This is a follow-up of 
[SPARK-3288](https://issues.apache.org/jira/browse/SPARK-3288).

Author: Takuya UESHIN <ues...@happy-camper.st>

Closes #6896 from ueshin/issues/SPARK-8476 and squashes the following commits:

89251d8 [Takuya UESHIN] Make inc/decDiskBytesSpilled in TaskMetrics private[spark].


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/fe08561e
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/fe08561e
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/fe08561e

Branch: refs/heads/master
Commit: fe08561e2ee13fc8f641db8b6e6c1499bdfd4d29
Parents: 9baf093
Author: Takuya UESHIN <ues...@happy-camper.st>
Authored: Fri Jun 19 10:48:16 2015 -0700
Committer: Reynold Xin <r...@databricks.com>
Committed: Fri Jun 19 10:48:16 2015 -0700

----------------------------------------------------------------------
 core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala | 4 ++--
 1 file changed, 2 insertions(+), 2 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/fe08561e/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala b/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala
index 38b61d7..a3b4561 100644
--- a/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala
+++ b/core/src/main/scala/org/apache/spark/executor/TaskMetrics.scala
@@ -94,8 +94,8 @@ class TaskMetrics extends Serializable {
    */
   private var _diskBytesSpilled: Long = _
   def diskBytesSpilled: Long = _diskBytesSpilled
-  def incDiskBytesSpilled(value: Long): Unit = _diskBytesSpilled += value
-  def decDiskBytesSpilled(value: Long): Unit = _diskBytesSpilled -= value
+  private[spark] def incDiskBytesSpilled(value: Long): Unit = _diskBytesSpilled += value
+  private[spark] def decDiskBytesSpilled(value: Long): Unit = _diskBytesSpilled -= value
 
   /**
    * If this task reads from a HadoopRDD or from persisted data, metrics on 
how much data was read

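The diff restricts the spill-counter mutators to Spark-internal code via Scala's package-qualified visibility: `private[spark]` makes a member visible anywhere inside the `org.apache.spark` package (and its subpackages) while hiding it from user code. A minimal, self-contained sketch of that pattern is below; `SpillMetrics` and `SpillMetricsDemo` are hypothetical names for illustration, not Spark's actual classes.

```scala
package org.apache.spark.executor

// Sketch of the accessor pattern used in TaskMetrics: the getter is
// public, but the mutators are visible only within org.apache.spark.
class SpillMetrics {
  private var _diskBytesSpilled: Long = 0L

  // Public read-only view of the metric.
  def diskBytesSpilled: Long = _diskBytesSpilled

  // Spark-internal mutators: callable from anywhere under
  // org.apache.spark, but not from user applications.
  private[spark] def incDiskBytesSpilled(value: Long): Unit =
    _diskBytesSpilled += value
  private[spark] def decDiskBytesSpilled(value: Long): Unit =
    _diskBytesSpilled -= value
}

// This object lives inside the spark package, so it may call the
// private[spark] mutators; code outside org.apache.spark could only
// read diskBytesSpilled.
object SpillMetricsDemo {
  def run(): Long = {
    val m = new SpillMetrics
    m.incDiskBytesSpilled(2048L)
    m.decDiskBytesSpilled(48L)
    m.diskBytesSpilled
  }
}
```

From outside `org.apache.spark`, calling `incDiskBytesSpilled` would fail to compile, which is exactly the encapsulation this follow-up to SPARK-3288 enforces.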
