Repository: spark
Updated Branches:
  refs/heads/master 08a7a836c -> 404a28f4e


[SPARK-11112] Fix Scala 2.11 compilation error in RDDInfo.scala

As shown in
https://amplab.cs.berkeley.edu/jenkins/view/Spark-QA-Compile/job/Spark-Master-Scala211-Compile/1946/console,
compilation fails with:
```
[error] /home/jenkins/workspace/Spark-Master-Scala211-Compile/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala:25: in class RDDInfo, multiple overloaded alternatives of constructor RDDInfo define default arguments.
[error] class RDDInfo(
[error]
```
This PR fixes the compilation error by dropping the auxiliary constructor and giving the `callSite` parameter a default value instead.
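For context, Scala 2.11 rejects a class whose primary and auxiliary constructors both declare default arguments, which is exactly the shape removed by this patch. A minimal sketch of the fix, using a hypothetical `Info` class with `String` standing in for `CallSite`:

```scala
// Hypothetical stand-in for RDDInfo: one constructor, defaults instead of overloads.
// The rejected pattern would have been an auxiliary `def this(...)` that also
// declared `scope: Option[String] = None` alongside the primary constructor's defaults.
class Info(
    val id: Int,
    val name: String,
    val callSite: String = "<empty>",      // default replaces the auxiliary constructor
    val scope: Option[String] = None)

object Demo extends App {
  // Callers that previously hit the auxiliary constructor still compile unchanged.
  val short = new Info(1, "rdd")
  val full  = new Info(1, "rdd", "at Demo.scala:12", Some("stage 0"))
  assert(short.callSite == "<empty>")
  assert(full.scope == Some("stage 0"))
  println("ok")
}
```

Collapsing the overloads into one constructor keeps every existing call site valid while sidestepping the ambiguity the 2.11 compiler flags.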

Author: tedyu <yuzhih...@gmail.com>

Closes #9538 from tedyu/master.


Project: http://git-wip-us.apache.org/repos/asf/spark/repo
Commit: http://git-wip-us.apache.org/repos/asf/spark/commit/404a28f4
Tree: http://git-wip-us.apache.org/repos/asf/spark/tree/404a28f4
Diff: http://git-wip-us.apache.org/repos/asf/spark/diff/404a28f4

Branch: refs/heads/master
Commit: 404a28f4edd09cf17361dcbd770e4cafde51bf6d
Parents: 08a7a83
Author: tedyu <yuzhih...@gmail.com>
Authored: Mon Nov 9 10:07:58 2015 -0800
Committer: Andrew Or <and...@databricks.com>
Committed: Mon Nov 9 10:07:58 2015 -0800

----------------------------------------------------------------------
 .../main/scala/org/apache/spark/storage/RDDInfo.scala   | 12 +-----------
 1 file changed, 1 insertion(+), 11 deletions(-)
----------------------------------------------------------------------


http://git-wip-us.apache.org/repos/asf/spark/blob/404a28f4/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala
----------------------------------------------------------------------
diff --git a/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala b/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala
index 3fa209b..87c1b98 100644
--- a/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala
+++ b/core/src/main/scala/org/apache/spark/storage/RDDInfo.scala
@@ -28,20 +28,10 @@ class RDDInfo(
     val numPartitions: Int,
     var storageLevel: StorageLevel,
     val parentIds: Seq[Int],
-    val callSite: CallSite,
+    val callSite: CallSite = CallSite.empty,
     val scope: Option[RDDOperationScope] = None)
   extends Ordered[RDDInfo] {
 
-  def this(
-      id: Int,
-      name: String,
-      numPartitions: Int,
-      storageLevel: StorageLevel,
-      parentIds: Seq[Int],
-      scope: Option[RDDOperationScope] = None) {
-    this(id, name, numPartitions, storageLevel, parentIds, CallSite.empty, scope)
-  }
-
   var numCachedPartitions = 0
   var memSize = 0L
   var diskSize = 0L


---------------------------------------------------------------------
To unsubscribe, e-mail: commits-unsubscr...@spark.apache.org
For additional commands, e-mail: commits-h...@spark.apache.org