I don't think that's necessary. You're looking at the hadoop-2.4
profile, which works with any Hadoop version >= 2.4. AFAIK no further
specialization is needed beyond that. The profile sets hadoop.version to
2.4.0 by default, but that can be overridden at build time.
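For example, building against Hadoop 2.5 would look something like this
(the exact 2.5.x version number here is just illustrative):

```shell
# Select the hadoop-2.4 profile but override the default hadoop.version
mvn -Phadoop-2.4 -Dhadoop.version=2.5.1 -DskipTests clean package
```

So the profile name doesn't have to match the Hadoop version you
actually build against.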

On Fri, Nov 14, 2014 at 3:43 PM, Corey Nolet <cjno...@gmail.com> wrote:
> I noticed Spark 1.2.0-SNAPSHOT still has 2.4.x in the pom. Since 2.5.x is
> the current stable Hadoop 2.x, would it make sense for us to update the
> poms?
