[ https://issues.apache.org/jira/browse/SPARK-26757?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26757:
------------------------------------

    Assignee: Apache Spark

> GraphX EdgeRDDImpl and VertexRDDImpl `count` method cannot handle empty RDDs
> ----------------------------------------------------------------------------
>
>                 Key: SPARK-26757
>                 URL: https://issues.apache.org/jira/browse/SPARK-26757
>             Project: Spark
>          Issue Type: Bug
>          Components: GraphX
>    Affects Versions: 2.3.1, 2.3.2, 2.4.0
>            Reporter: Huon Wilson
>            Assignee: Apache Spark
>            Priority: Minor
>
> The {{EdgeRDDImpl}} and {{VertexRDDImpl}} types provided by {{GraphX}} throw
> {{java.lang.UnsupportedOperationException: empty collection}} when {{count}}
> is called on an empty instance, instead of returning 0.
> {code:scala}
> import org.apache.spark.graphx.{Graph, Edge}
> val graph = Graph.fromEdges(sc.emptyRDD[Edge[Unit]], 0)
> graph.vertices.count
> graph.edges.count
> {code}
> Running that code in a spark-shell:
> {code:none}
> scala> import org.apache.spark.graphx.{Graph, Edge}
> import org.apache.spark.graphx.{Graph, Edge}
> scala> val graph = Graph.fromEdges(sc.emptyRDD[Edge[Unit]], 0)
> graph: org.apache.spark.graphx.Graph[Int,Unit] = org.apache.spark.graphx.impl.GraphImpl@6879e983
> scala> graph.vertices.count
> java.lang.UnsupportedOperationException: empty collection
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$apply$36.apply(RDD.scala:1031)
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$apply$36.apply(RDD.scala:1031)
>   at scala.Option.getOrElse(Option.scala:121)
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1031)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>   at org.apache.spark.rdd.RDD.reduce(RDD.scala:1011)
>   at org.apache.spark.graphx.impl.VertexRDDImpl.count(VertexRDDImpl.scala:90)
>   ... 49 elided
> scala> graph.edges.count
> java.lang.UnsupportedOperationException: empty collection
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$apply$36.apply(RDD.scala:1031)
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1$$anonfun$apply$36.apply(RDD.scala:1031)
>   at scala.Option.getOrElse(Option.scala:121)
>   at org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1031)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
>   at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
>   at org.apache.spark.rdd.RDD.withScope(RDD.scala:363)
>   at org.apache.spark.rdd.RDD.reduce(RDD.scala:1011)
>   at org.apache.spark.graphx.impl.EdgeRDDImpl.count(EdgeRDDImpl.scala:90)
>   ... 49 elided
> {code}
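For context, both stack traces end in {{RDD.reduce}} inside the two {{count}} overrides ({{VertexRDDImpl.scala:90}}, {{EdgeRDDImpl.scala:90}}), which suggests the overrides sum per-partition sizes with {{reduce(_ + _)}}. The snippet below is a minimal spark-shell sketch, not the actual Spark source: it only shows why {{reduce}} fails on an empty RDD while a combinator with a zero element such as {{fold(0)}} returns 0, and it ends with a user-side workaround that is an assumption (mapping to a plain RDD should fall back to the base {{RDD.count}}).

{code:scala}
// Minimal sketch, runnable in spark-shell. It uses only the standard RDD API
// and does not reproduce the GraphX internals, just the reduce-vs-fold
// behaviour the stack traces point at.
import org.apache.spark.graphx.{Edge, Graph}

val empty = sc.emptyRDD[Long]

// RDD.reduce has no zero element, so on an empty RDD it throws
// java.lang.UnsupportedOperationException: empty collection
// empty.reduce(_ + _)

// RDD.fold takes a zero element and returns it for an empty RDD, so a count
// implemented as a fold over per-partition sizes would yield 0 instead.
empty.fold(0L)(_ + _)                 // Long = 0

// Possible user-side workaround (assumption, not verified on every affected
// version): .map produces a plain RDD, so the base RDD.count is used and the
// empty graph no longer throws.
val graph = Graph.fromEdges(sc.emptyRDD[Edge[Unit]], 0)
graph.vertices.map(_ => ()).count()   // expected: 0
graph.edges.map(_ => ()).count()      // expected: 0
{code}

If the real overrides do sum partition sizes with {{reduce}}, swapping in {{fold(0)(_ + _)}} looks like a small, contained change, but that is an inference from the stack traces rather than from the source.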