Subject: RE: What does "Spark is not just MapReduce" mean? Isn't every Spark job a form of MapReduce?
Spark comes with quite a few components. At its core is... surprise... Spark Core. This provides the core functionality required to run Spark jobs. Spark provides a lot of operators out of the box... take a look.
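To give a feel for "a lot of operators out of the box", here is a minimal, purely illustrative sketch (the dataset, app name, and object name are invented) of a few RDD operators that go beyond a plain map and reduce:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Illustrative only: a handful of RDD operators beyond map/reduce.
object OperatorTour {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("operator-tour").setMaster("local[*]"))

    val lines = sc.parallelize(Seq("the quick brown fox", "jumps over the lazy dog"))

    val wordCounts = lines
      .flatMap(_.split("\\s+"))   // split each line into words
      .filter(_.nonEmpty)         // drop empty tokens
      .map(word => (word, 1))
      .reduceByKey(_ + _)         // per-key aggregation, combined map-side first
      .sortByKey()                // total ordering across partitions

    wordCounts.collect().foreach(println)
    sc.stop()
  }
}
```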
Spark is partitioner-aware, so it can exploit a situation where two datasets are partitioned the same way (for example by doing a map-side join on them). MapReduce does not expose this.
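A minimal sketch of what "partitioner aware" buys you, assuming two key-value RDDs (the data here is made up): once both sides have been partitioned with the same HashPartitioner and cached, the join is a narrow dependency and needs no extra shuffle.

```scala
import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

// Illustrative only: co-partitioned RDDs can be joined without re-shuffling.
object CoPartitionedJoin {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("copartitioned-join").setMaster("local[*]"))

    val partitioner = new HashPartitioner(8)

    // Partition both datasets the same way and cache them, so Spark
    // remembers the partitioner on each RDD.
    val users  = sc.parallelize(Seq((1, "alice"), (2, "bob")))
                   .partitionBy(partitioner).cache()
    val orders = sc.parallelize(Seq((1, 9.99), (2, 19.99), (1, 4.50)))
                   .partitionBy(partitioner).cache()

    // Both sides share the partitioner, so matching keys already live in the
    // same partition; the join does not trigger another shuffle.
    users.join(orders).collect().foreach(println)

    sc.stop()
  }
}
```

The same idea is why operators like reduceByKey on an already-partitioned RDD can skip a second shuffle.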
On Sun, Jun 28, 2015 at 12:13 PM, YaoPau jonrgr...@gmail.com wrote:
I've heard "Spark is not just MapReduce"
Vanilla MapReduce does not expose it, but Hive on top of MapReduce has superior partitioning (and bucketing) support compared to Spark.
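For reference, the Hive bucketing being alluded to looks roughly like this. A hypothetical sketch (table and column names are invented), issuing the DDL through a Spark 1.x HiveContext, which hands it to Hive:

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

// Illustrative only: declaring a bucketed (CLUSTERED BY) table in Hive.
// Hive can use this layout for bucket map-joins; Spark 1.x largely ignores it.
object HiveBucketingSketch {
  def main(args: Array[String]): Unit = {
    val sc   = new SparkContext(new SparkConf().setAppName("hive-bucketing").setMaster("local[*]"))
    val hive = new HiveContext(sc)

    hive.sql(
      """CREATE TABLE IF NOT EXISTS clicks (user_id INT, url STRING)
        |CLUSTERED BY (user_id) INTO 32 BUCKETS
        |STORED AS ORC""".stripMargin)

    sc.stop()
  }
}
```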
2015-06-28 13:44 GMT-07:00 Koert Kuipers ko...@tresata.com:
Spark is partitioner-aware, so it can exploit a situation where two datasets are partitioned the same way