I'm personally in favor, but I don't have a sense of how many people still
rely on Hadoop 1.

Nick

On Fri, Jun 12, 2015 at 9:13 AM, Steve Loughran
<ste...@hortonworks.com> wrote:

> +1 for 2.2+
>
> Not only are the APIs in Hadoop 2 better, there are more people testing
> Hadoop 2.x & Spark, and bugs in Hadoop itself are being fixed.
>
> (usual disclaimers: I work off branch-2.7 snapshots that I build nightly, etc.)
>
> > On 12 Jun 2015, at 11:09, Sean Owen <so...@cloudera.com> wrote:
> >
> > How does the idea of removing support for Hadoop 1.x for Spark 1.5
> > strike everyone? Really, I mean, Hadoop < 2.2, as 2.2 seems to me more
> > consistent with the modern 2.x line than 2.1 or 2.0.
> >
> > The argument against is simply that someone out there might still be
> > using these versions.
> >
> > The arguments for are just simplification -- fewer gotchas in trying
> > to keep supporting older Hadoop, of which we've seen several lately.
> > We get to chop out a little bit of shim code and update to use some
> > non-deprecated APIs. Along with removing support for Java 6, it might
> > be a reasonable time to draw a line under older Hadoop as well.
> >
> > I'm just gauging feeling now: for, against, indifferent?
> > I favor it, but would not push hard on it if there are objections.
> >
