Should a new job be set up under Spark-Master-Maven-with-YARN for Hadoop
2.6.x?
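
For reference, I'd assume such a job would simply invoke the standard Maven
build with the yarn and hadoop-2.6 profiles, something along the lines of:

    build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package

(the exact hadoop.version the job would pin is just a guess on my part).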

Cheers

On Thu, Nov 19, 2015 at 5:16 PM, 张志强(旺轩) <[email protected]> wrote:

> Agreed
> +1
>
> ------------------------------------------------------------------
> From: Reynold Xin <[email protected]>
> Date: November 20, 2015 06:14:44
> To: [email protected] <[email protected]>; Sean Owen <[email protected]>;
> Thomas Graves <[email protected]>
> Subject: Dropping support for earlier Hadoop versions in Spark 2.0?
>
>
> I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I
> think everybody is in favor of that.
>
> https://issues.apache.org/jira/browse/SPARK-11807
>
> Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is
> to say, keep only Hadoop 2.6 and greater.
>
> What are the community's thoughts on that?
