+1.  Going down this approach right now.  Will update the Mahout PR and
send out a link when it's ready for review.

On Mon, Jul 17, 2017 at 1:21 PM, Trevor Grant <trevor.d.gr...@gmail.com>
wrote:

> What if we remove the hadoop2 profile, making all of its settings just
> hard-coded defaults? (The profile existed back when we supported both
> Hadoop 1 and Hadoop 2, but we haven't supported Hadoop 1 for a while.)
>
> Then override those values in the Spark 2.2 profile with Hadoop 2.6, and
> enforce Java 8 with a plugin so the build fails if compiled with Java 7.
> Something roughly like the sketch below.
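>
> For the Java check, a rough sketch using the maven-enforcer-plugin (the
> usual tool for this kind of rule; the plugin version and execution id
> here are just illustrative, not what's in our pom today):
>
>   <plugin>
>     <groupId>org.apache.maven.plugins</groupId>
>     <artifactId>maven-enforcer-plugin</artifactId>
>     <version>1.4.1</version>
>     <executions>
>       <execution>
>         <id>enforce-java-8</id>
>         <goals>
>           <goal>enforce</goal>
>         </goals>
>         <configuration>
>           <rules>
>             <!-- Fail the build on any JDK older than 1.8 -->
>             <requireJavaVersion>
>               <version>[1.8,)</version>
>             </requireJavaVersion>
>           </rules>
>         </configuration>
>       </execution>
>     </executions>
>   </plugin>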
>
> My thought.
>
>
>
> On Mon, Jul 17, 2017 at 11:02 AM, dustin vanstee <dustinvans...@gmail.com>
> wrote:
>
> > Hi, Trevor and I were able to get the latest version of Mahout to
> > compile with Spark 2.2.  The main tweaks are that Spark 2.2 requires
> > Java 8 and Hadoop 2.6 or greater.  The issue is that we have a hadoop2
> > profile that sets hadoop.version=2.4.1 and loads dependencies that are
> > only compatible with Spark 2.1 and below.  I would like to propose
> > removing the hadoop2 profile and just baking the Hadoop version and
> > dependencies into each Spark profile, roughly as sketched below.  I
> > wanted to run that by the community before I went too far with it and
> > get feedback on whether there's a better alternative.  Trevor, can you
> > weigh in if I missed anything?
> >
>