Hi,

I am trying to do custom partitioning on a text file. I first convert it into
a pair RDD and then apply my custom partitioner, but somehow it does not
take effect. My code snippet is below.

val file = sc.textFile(filePath)
val locLines = file.map(line => line.split("\t"))
                   .map(fields => ((fields(2).toDouble, fields(3).toDouble), fields(5).toLong))
val ck = locLines.partitionBy(new HashPartitioner(50)) // also tried new CustomPartitioner(50); neither works
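In case it helps, the kind of partitioner I mean is along these lines. This is only a sketch: the bucketing rule (non-negative modulo of the key's hashCode, which is also what HashPartitioner does) is illustrative, and the Spark wiring is shown in comments since it needs org.apache.spark.Partitioner on the classpath.

```scala
// A Spark Partitioner must map every key into [0, numPartitions).
// Pure bucketing logic, runnable without a SparkContext:
def partitionFor(key: (Double, Double), numPartitions: Int): Int = {
  val raw = key.hashCode % numPartitions
  if (raw < 0) raw + numPartitions else raw // keep the index non-negative
}

// Wired into Spark it would look roughly like this (needs Spark on the classpath):
//
//   import org.apache.spark.Partitioner
//
//   class CustomPartitioner(override val numPartitions: Int) extends Partitioner {
//     override def getPartition(key: Any): Int = key match {
//       case k: (Double, Double) @unchecked => partitionFor(k, numPartitions)
//       case _ => 0
//     }
//     // equals/hashCode let Spark recognize identically partitioned RDDs
//     // and skip an unnecessary shuffle:
//     override def equals(other: Any): Boolean = other match {
//       case p: CustomPartitioner => p.numPartitions == numPartitions
//       case _ => false
//     }
//     override def hashCode: Int = numPartitions
//   }

println(partitionFor((12.97, 77.59), 50))
```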

When reading the file with the "textFile" method, Spark automatically
partitions it. However, when I explicitly try to partition the new RDD
"locLines", nothing seems to happen: the number of partitions stays the
same as what sc.textFile() created.

Any help in this regard will be highly appreciated.




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Custom-Partitioning-Spark-tp22571.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
