Where did you look?

BTW, it is defined in the RDD class as a val:

val partitioner: Option[Partitioner]
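
From Java you can reach that val through the underlying Scala RDD, since a Scala val compiles down to an accessor method. A rough sketch against the Spark 1.2 Java API (going through rdd(), since JavaPairRDD itself doesn't expose a partitioner() method in that release; the master string and app name are just for illustration):

    import java.util.Arrays;
    import org.apache.spark.HashPartitioner;
    import org.apache.spark.Partitioner;
    import org.apache.spark.api.java.JavaPairRDD;
    import org.apache.spark.api.java.JavaSparkContext;
    import scala.Option;
    import scala.Tuple2;

    JavaSparkContext sc = new JavaSparkContext("local[2]", "partitioner-check");

    JavaPairRDD<String, Integer> pairs = sc.parallelizePairs(
        Arrays.asList(new Tuple2<>("a", 1), new Tuple2<>("b", 2)));

    // Before partitionBy, the partitioner val is typically None.
    Option<Partitioner> before = pairs.rdd().partitioner();
    System.out.println("before: " + (before.isDefined() ? before.get() : "none"));

    // After partitionBy, it holds the HashPartitioner you supplied.
    JavaPairRDD<String, Integer> partitioned = pairs.partitionBy(new HashPartitioner(4));
    Option<Partitioner> after = partitioned.rdd().partitioner();
    System.out.println("after: " + (after.isDefined() ? after.get() : "none"));

    sc.stop();

Note that what comes back is a scala.Option, so check isDefined() before calling get().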


Mohammed

-----Original Message-----
From: Darin McBeath [mailto:ddmcbe...@yahoo.com.INVALID] 
Sent: Tuesday, February 17, 2015 1:45 PM
To: User
Subject: How do you get the partitioner for an RDD in Java?

In an 'early release' of the Learning Spark book, there is the following 
reference:

"In Scala and Java, you can determine how an RDD is partitioned using its partitioner property (or partitioner() method in Java)."

However, I don't see the 'partitioner()' method mentioned above in Spark 1.2, or any other way of getting this information.

I'm curious if anyone has any suggestions for how I might go about finding how 
an RDD is partitioned in a Java program.

Thanks.

Darin.


