+1.

Hadoop 2.6 would be a good choice, with many features added (like support
for long-running services and label-based scheduling). Currently there's a
lot of reflection code to support multiple versions of YARN, so upgrading
to a newer version will really ease the pain :).
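(For context, the reflection pattern in question looks roughly like the
sketch below — probing at runtime for an API that only exists in newer
versions and falling back otherwise. The class and method names here are
hypothetical illustrations, not Spark's actual YARN shim code.)

```java
import java.lang.reflect.Method;

public class VersionShim {
    // Probe for a method that only newer versions provide; if it is
    // absent, fall back to behavior that works on older versions.
    static String describe(Object obj) {
        try {
            // Hypothetical "new API": present on String, absent elsewhere.
            Method m = obj.getClass().getMethod("toUpperCase");
            return (String) m.invoke(obj);
        } catch (NoSuchMethodException e) {
            // Older version: the method does not exist, use a fallback.
            return obj.toString();
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }

    public static void main(String[] args) {
        System.out.println(describe("hadoop")); // new path taken
        System.out.println(describe(42));       // fallback path taken
    }
}
```

Dropping the older versions would let such call sites become plain,
compile-checked method calls.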

Thanks
Saisai

On Fri, Nov 20, 2015 at 3:58 PM, Jean-Baptiste Onofré <j...@nanthrax.net>
wrote:

> +1
>
> Regards
> JB
>
>
> On 11/19/2015 11:14 PM, Reynold Xin wrote:
>
>> I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I
>> think everybody is for that.
>>
>> https://issues.apache.org/jira/browse/SPARK-11807
>>
>> Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That
>> is to say, keep only Hadoop 2.6 and greater.
>>
>> What are the community's thoughts on that?
>>
>>
> --
> Jean-Baptiste Onofré
> jbono...@apache.org
> http://blog.nanthrax.net
> Talend - http://www.talend.com
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@spark.apache.org
> For additional commands, e-mail: dev-h...@spark.apache.org
>
>
