Hi Ashok,

In my opinion, we should look at Hadoop as a general-purpose framework that
supports multiple models, and we should look at Spark as an alternative to
Hadoop MapReduce rather than a replacement for the Hadoop ecosystem (for
instance, Spark is not replacing ZooKeeper, HDFS, etc.).
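As a rough illustration of that division of labor, a Spark job is commonly submitted to an existing Hadoop cluster, with YARN doing the scheduling and HDFS holding the data; only the MapReduce execution engine is swapped out. (The namenode host, port, and paths below are hypothetical examples, and assume a cluster where YARN and HDFS are already configured.)

```shell
# Submit a Spark application to a Hadoop cluster: YARN schedules the job,
# and the input/output live in HDFS. Spark here replaces only the MapReduce
# execution engine, not the storage (HDFS) or coordination (ZooKeeper) layers.
# Hostname, port, and paths are made-up placeholders.
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  my_job.py \
  hdfs://namenode:8020/data/input \
  hdfs://namenode:8020/data/output
```

In other words, Spark typically runs *on top of* the Hadoop stack rather than beside it, which is why "Spark replaces Hadoop" is misleading.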

Regards

On Thu, Apr 14, 2016 at 4:22 PM, Andy Davidson <
a...@santacruzintegration.com> wrote:

> Hi Ashok
>
> In general, if I were starting a new project and had not invested heavily
> in Hadoop (i.e. had a large staff that was trained on Hadoop, had a lot of
> existing projects implemented on Hadoop, …) I would probably start using
> Spark. It's faster and easier to use.
>
> Your mileage may vary
>
> Andy
>
> From: Ashok Kumar <ashok34...@yahoo.com.INVALID>
> Reply-To: Ashok Kumar <ashok34...@yahoo.com>
> Date: Thursday, April 14, 2016 at 12:13 PM
> To: "user @spark" <user@spark.apache.org>
> Subject: Spark replacing Hadoop
>
> Hi,
>
> I hear some people saying that Hadoop is getting old and out of date and
> will be replaced by Spark!
>
> Does this make sense and if so how accurate is it?
>
> Best
>
>