Sending the response back to the dev list so this is indexable and
searchable by others.
-- Forwarded message --
From: Milos Nikolic
Date: Sat, Aug 30, 2014 at 5:50 PM
Subject: Re: Partitioning strategy changed in Spark 1.0.x?
To: Reynold Xin
Thank you, your insights were very helpful.
Hi guys,
I’ve noticed some changes in the behavior of partitioning under Spark 1.0.x.
I’d appreciate it if someone could explain what has changed between versions.
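As background for the question, here is a minimal sketch of how hash partitioning assigns keys to partitions, mirroring the behavior of Spark's default HashPartitioner (`partition = nonNegativeMod(key.hashCode, numPartitions)`). This is plain Python purely for illustration, not Spark API code:

```python
def hash_partition(key, num_partitions):
    """Assign a key to a partition the way Spark's HashPartitioner does:
    nonNegativeMod(key.hashCode(), numPartitions)."""
    mod = hash(key) % num_partitions
    # Python's % already yields a non-negative result for a positive
    # modulus; Spark's Scala implementation has to correct negative
    # hash codes explicitly.
    return mod

# Two pair-RDDs that use the same partitioner with the same number of
# partitions send equal keys to the same partition index -- this is
# what makes collocating partitions by key possible.
keys = [0, 1, 2, 3, 4]
print([hash_partition(k, 4) for k in keys])  # → [0, 1, 2, 3, 0]
```

In Spark itself, the equivalent is partitioning both RDDs with the same partitioner, e.g. `rdd.partitionBy(new HashPartitioner(n))`, so that records with equal keys end up in partitions with the same index.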
Here is a small example. I want to create two RDD[(K, V)] objects and then
collocate partitions with the same K on one node. When the