Re: Workaround for SPARK-1931 not compiling

2014-10-24 Thread Ankur Dave
At 2014-10-23 09:48:55 +0530, Arpit Kumar arp8...@gmail.com wrote: "error: value partitionBy is not a member of org.apache.spark.rdd.RDD[(org.apache.spark.graphx.PartitionID, org.apache.spark.graphx.Edge[ED])]". Since partitionBy is a member of PairRDDFunctions, it sounds like the implicit conversion from RDD to PairRDDFunctions is not being applied.
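
The rest of the reply is truncated in this archive view. A sketch of the kind of fix the hint points toward on Spark 1.0.x: bring the RDD-to-PairRDDFunctions implicit into scope via SparkContext._ and give ED a ClassTag context bound. The exact code from the full reply is not shown here, so treat the details below as an assumption (written spark-shell style, with a bare def as in the programming guide's workaround):

import scala.reflect.ClassTag

import org.apache.spark.HashPartitioner
// On Spark 1.0.x the rddToPairRDDFunctions implicit lives in the SparkContext
// companion object; without this import, .partitionBy is not found on a pair RDD.
import org.apache.spark.SparkContext._
import org.apache.spark.graphx._
import org.apache.spark.rdd.RDD

// ClassTag context bound so the implicit conversion to PairRDDFunctions can resolve.
def partitionBy[ED: ClassTag](
    edges: RDD[Edge[ED]],
    partitionStrategy: PartitionStrategy): RDD[Edge[ED]] = {
  val numPartitions = edges.partitions.size
  edges
    .map(e => (partitionStrategy.getPartition(e.srcId, e.dstId, numPartitions), e))
    .partitionBy(new HashPartitioner(numPartitions))
    .mapPartitions(_.map(_._2), preservesPartitioning = true)
}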

Re: Workaround for SPARK-1931 not compiling

2014-10-24 Thread Arpit Kumar
Thanks a lot. Now it is working properly. On Sat, Oct 25, 2014 at 2:13 AM, Ankur Dave ankurd...@gmail.com wrote: At 2014-10-23 09:48:55 +0530, Arpit Kumar arp8...@gmail.com wrote: error: value partitionBy is not a member of org.apache.spark.rdd.RDD[(org.apache.spark.graphx.PartitionID,

Workaround for SPARK-1931 not compiling

2014-10-22 Thread Arpit Kumar
Hi all, I am new to Spark/GraphX and am trying to use partitioning strategies in GraphX on Spark 1.0.0. The workaround I saw on the main page does not seem to compile. The code I added was: def partitionBy[ED](edges: RDD[Edge[ED]], partitionStrategy: PartitionStrategy): RDD[Edge[ED]] = { val
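
The quoted code is cut off in this preview. For context, a minimal end-to-end sketch of the same partition-then-rebuild approach, written inline against the GraphX 1.x API; the example graph, app name, and local master below are made-up values for illustration, not taken from the original message:

import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}
// Needed on Spark 1.0.x so the pair-RDD .partitionBy implicit is in scope.
import org.apache.spark.SparkContext._
import org.apache.spark.graphx._

object PartitionWorkaroundExample {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("spark-1931-workaround").setMaster("local[2]"))

    // A tiny example edge set; attributes are arbitrary strings.
    val edges = sc.parallelize(Seq(
      Edge(1L, 2L, "a"), Edge(2L, 3L, "b"), Edge(3L, 1L, "c")), 2)

    val numPartitions = edges.partitions.size
    val strategy = PartitionStrategy.EdgePartition2D

    // Same shape as the guide's workaround, inlined: key each edge by its
    // target partition, shuffle with a HashPartitioner, then drop the keys.
    val partitionedEdges = edges
      .map(e => (strategy.getPartition(e.srcId, e.dstId, numPartitions), e))
      .partitionBy(new HashPartitioner(numPartitions))
      .mapPartitions(_.map(_._2), preservesPartitioning = true)

    // Rebuild the graph from the pre-partitioned edges instead of calling
    // Graph.partitionBy on an existing graph.
    val graph = Graph.fromEdges(partitionedEdges, defaultValue = 0)
    println(s"triplets: ${graph.triplets.count()}")

    sc.stop()
  }
}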