I need to sort my RDD partitions, but a whole partition might not fit
into memory, so I cannot simply call Java's Collections.sort() on it. Does
Spark itself support sorting within partitions? I am working on version
1.1.0.
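For context, the general technique for sorting data larger than memory is an external (out-of-core) merge sort: sort chunks that do fit in memory, then lazily merge the sorted runs. The sketch below illustrates that idea with the standard library only; it is not Spark's API, and the function name `external_sort` is my own. (In a real Spark job each run would be spilled to disk, and Spark's own sortByKey performs a shuffle-based sort rather than an in-memory sort of a whole partition.)

```python
# Minimal sketch of an external merge sort, assuming the input is an
# iterable that is too large to sort in one piece. Each run of at most
# `run_size` records is sorted in memory; the sorted runs are then
# merged lazily with heapq.merge.
import heapq
import itertools

def external_sort(records, run_size):
    """Sort an iterable by splitting it into in-memory runs and merging them."""
    records = iter(records)
    runs = []
    while True:
        run = list(itertools.islice(records, run_size))
        if not run:
            break
        # In a real out-of-core sort, each sorted run would be written
        # to disk here instead of being kept in memory.
        runs.append(sorted(run))
    return list(heapq.merge(*runs))
```

This keeps only one run's worth of unsorted data in memory at a time, which is the same principle a framework uses when it spills sorted runs to disk during a large sort.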

I looked up a similar, unanswered question:

http://apache-spark-user-list.1001560.n3.nabble.com/sort-order-after-reduceByKey-groupByKey-td2959.html

Thanks All!!



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Partition-sorting-by-Spark-framework-tp18213.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
