Hi,

I am following http://codingjunkie.net/spark-secondary-sort/ to
implement a secondary sort in my Spark job.

I have defined my key case class as

case class DeviceKey(serialNum: String, eventDate: String, EventTs: Long) {
  implicit def orderingBySerialNum[A <: DeviceKey]: Ordering[A] =
    Ordering.by(fk => (fk.serialNum, fk.eventDate, fk.EventTs * -1))
}
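
The intent is to sort ascending on serialNum and eventDate, and descending on EventTs, e.g. (a quick local check with made-up values; I invoke the ordering method explicitly just for this check):

val keys = List(
  DeviceKey("SN1", "2017-03-27", 100L),
  DeviceKey("SN1", "2017-03-27", 200L),
  DeviceKey("SN0", "2017-03-27", 50L)
)

// expected: SN0 first, then SN1 with the newest EventTs (200) before 100
keys.sorted(keys.head.orderingBySerialNum[DeviceKey])
// -> List(DeviceKey(SN0,2017-03-27,50), DeviceKey(SN1,2017-03-27,200), DeviceKey(SN1,2017-03-27,100))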

but when I try to call

t.repartitionAndSortWithinPartitions(partitioner)

where t is an RDD[(DeviceKey, Int)],

I get the error:
value repartitionAndSortWithinPartitions is not a member of
org.apache.spark.rdd.RDD[(DeviceKey, Int)]
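
For completeness, the partitioner I am passing is a simple custom Partitioner that groups by serialNum only (sketched here from memory; the class name and hashing are illustrative):

import org.apache.spark.Partitioner

// illustrative: all events of one device go to the same partition
class DeviceKeyPartitioner(override val numPartitions: Int) extends Partitioner {
  require(numPartitions > 0, s"Number of partitions ($numPartitions) must be positive.")

  override def getPartition(key: Any): Int = {
    val k = key.asInstanceOf[DeviceKey]
    // non-negative modulo of the serialNum hash code
    ((k.serialNum.hashCode % numPartitions) + numPartitions) % numPartitions
  }
}

val partitioner = new DeviceKeyPartitioner(10)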


The full example code is available at
http://stackoverflow.com/questions/43038682/secondary-sort-using-apache-spark-1-6

Could somebody help me understand this error?

Many Thanks

Pari


-- 
Cheers,
Pari
