Assuming we have 1.6 and 1.7 releases, Spark 2.0 is about 9 months away.

Customers would need to upgrade their Hadoop clusters to Apache Hadoop 2.6 or
later to leverage Spark 2.0 within about a year. I think this is feasible, as
the latest releases of both CDH 5.x and HDP 2.x are already on Apache Hadoop
2.6.0. Companies will have enough time to upgrade their clusters.
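
For anyone who wants a quick sanity check of what a cluster's classpath is
actually on, something like the sketch below (my own illustration, not anything
in Spark itself; the 2.6 floor check is just the proposal encoded by hand) can
report the Hadoop version via Hadoop's public VersionInfo API:

import org.apache.hadoop.util.VersionInfo

object HadoopVersionCheck {
  def main(args: Array[String]): Unit = {
    // VersionInfo.getVersion returns the Hadoop version string on the
    // classpath, e.g. "2.6.0" or a vendor build like "2.6.0-cdh5.4.0".
    val version = VersionInfo.getVersion
    // Take only the major and minor components; vendor suffixes on later
    // components are ignored by take(2).
    val Array(major, minor) = version.split("\\.").take(2).map(_.toInt)
    // The proposal on the table: Spark 2.0 supports only Hadoop 2.6+.
    val supported = major > 2 || (major == 2 && minor >= 6)
    println(s"Hadoop $version meets the proposed 2.6+ floor: $supported")
  }
}
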

+1 from me as well

Chester




Sent from my iPad

> On Nov 19, 2015, at 2:14 PM, Reynold Xin <r...@databricks.com> wrote:
> 
> I proposed dropping support for Hadoop 1.x in the Spark 2.0 email, and I 
> think everybody is for that.
> 
> https://issues.apache.org/jira/browse/SPARK-11807
> 
> Sean suggested also dropping support for Hadoop 2.2, 2.3, and 2.4. That is to 
> say, keep only Hadoop 2.6 and greater.
> 
> What are the community's thoughts on that?
> 
