[ https://issues.apache.org/jira/browse/SPARK-4335?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Reynold Xin updated SPARK-4335:
-------------------------------
    Priority: Blocker  (was: Major)

> Mima check misreporting for GraphX pull request
> -----------------------------------------------
>
>                 Key: SPARK-4335
>                 URL: https://issues.apache.org/jira/browse/SPARK-4335
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.2.0
>            Reporter: Reynold Xin
>            Priority: Blocker
>
> MiMa is reporting binary compatibility check failures for RDD.getPartitions and RDD.compute in the following pull request, even though the pull request does not change those methods at all.
> https://github.com/apache/spark/pull/2530
> {code}
> [error]  * abstract method getPartitions()Array[org.apache.spark.Partition] in class org.apache.spark.rdd.RDD does not have a correspondent in new version
> [error]    filter with: ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.getPartitions")
> [error]  * abstract method compute(org.apache.spark.Partition,org.apache.spark.TaskContext)scala.collection.Iterator in class org.apache.spark.rdd.RDD does not have a correspondent in new version
> [error]    filter with: ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.compute")
> [error]  * abstract method getPartitions()Array[org.apache.spark.Partition] in class org.apache.spark.rdd.RDD does not have a correspondent in new version
> [error]    filter with: ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.getPartitions")
> [error]  * abstract method compute(org.apache.spark.Partition,org.apache.spark.TaskContext)scala.collection.Iterator in class org.apache.spark.rdd.RDD does not have a correspondent in new version
> [error]    filter with: ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.compute")
> [info] Done updating.
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
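For reference, the `filter with:` lines in the error output show the exclusions MiMa suggests when a binary change is intentional; in Spark these are normally registered in `project/MimaExcludes.scala`. A minimal sketch of what that would look like is below. Note this is shown only to illustrate the mechanism: the failures in this report are spurious, so the right fix is in the MiMa check itself, not in adding these filters.

```scala
// Sketch of MiMa exclusion filters as suggested by the error output.
// Object and value names follow Spark's project/MimaExcludes.scala convention;
// this is an illustration, not a proposed fix for this issue.
import com.typesafe.tools.mima.core._

object MimaExcludes {
  val excludes = Seq(
    // Excludes MiMa suggested for the (misreported) RDD abstract methods:
    ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.getPartitions"),
    ProblemFilters.exclude[AbstractMethodProblem]("org.apache.spark.rdd.RDD.compute")
  )
}
```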