In Scala, we can make two RDDs co-partitioned by building them with the same partitioner:

    val partitioner = new HashPartitioner(5)
    val a1 = a.partitionBy(partitioner).cache()
    val b1 = b.partitionBy(partitioner).cache()
How can we achieve the same in Python? It would be great if somebody could share some examples.

Thanks,
Xiang

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/make-two-rdd-co-partitioned-in-python-tp22445.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
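A reply-style sketch for the question above: in PySpark, `RDD.partitionBy(numPartitions, partitionFunc)` takes a partition count and an optional hash function rather than a Partitioner object, so passing the same `numPartitions` (and, if customized, the same `partitionFunc`) to both RDDs co-partitions them. The runnable model below uses plain Python to illustrate why this works; it assumes the keys are small integers, for which the default partition assignment behaves like `hash(key) % numPartitions` (PySpark's actual default, `portable_hash`, treats `None` and tuples specially).

```python
# PySpark translation of the Scala snippet (not executed here):
#
#   num_partitions = 5
#   a1 = a.partitionBy(num_partitions).cache()
#   b1 = b.partitionBy(num_partitions).cache()
#
# Pure-Python model of why this co-partitions the two datasets.

num_partitions = 5

def partition_of(key, n=num_partitions):
    # Mirrors the default assignment for hashable keys:
    # partition index = hash(key) % numPartitions.
    return hash(key) % n

# Two toy pair datasets standing in for the RDDs a and b.
a = [(1, "a"), (2, "b"), (7, "c")]
b = [(1, "x"), (7, "y"), (9, "z")]

def keys_by_partition(pairs, n=num_partitions):
    # Group each dataset's keys by the partition index they map to.
    out = {i: [] for i in range(n)}
    for k, _ in pairs:
        out[partition_of(k, n)].append(k)
    return out

pa = keys_by_partition(a)
pb = keys_by_partition(b)

# Keys present in both datasets land in the same partition index on
# both sides, so a join between a1 and b1 needs no extra shuffle.
shared = {1, 7}
assert all(k in pa[partition_of(k)] and k in pb[partition_of(k)]
           for k in shared)
```

The key point is that co-partitioning in PySpark is determined by the pair (`numPartitions`, `partitionFunc`), not by a shared partitioner object as in Scala; two RDDs partitioned with identical values of both are co-partitioned.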