+1 for something like that. Again, Spark just makes tons of binary releases,
each bound to a specific flavor of Hadoop 1 or Hadoop 2, including CDH etc.

Not sure if it is totally feasible with build techniques alone (the
ubiquitous #ifdef macros immediately spring to mind, something I do not
miss at all in Java), but if it is, it is the way to go.
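For what the classifier approach could look like, here is a minimal sketch of two Maven profiles that swap the Hadoop dependency version and attach a hadoop1/hadoop2 classifier to the built jar. The property names, version numbers, and default profile choice are my assumptions for illustration, not the actual Mahout build:

```xml
<!-- Hypothetical sketch only: property names and Hadoop versions are
     assumptions, not taken from the real Mahout pom. -->
<profiles>
  <profile>
    <id>hadoop1</id>
    <properties>
      <hadoop.version>1.2.1</hadoop.version>
      <hadoop.classifier>hadoop1</hadoop.classifier>
    </properties>
  </profile>
  <profile>
    <id>hadoop2</id>
    <activation>
      <activeByDefault>true</activeByDefault>
    </activation>
    <properties>
      <hadoop.version>2.4.0</hadoop.version>
      <hadoop.classifier>hadoop2</hadoop.classifier>
    </properties>
  </profile>
</profiles>

<build>
  <plugins>
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <configuration>
        <!-- attach the classifier so consumers can depend on
             e.g. mahout-core-0.9-hadoop1.jar vs. -hadoop2.jar -->
        <classifier>${hadoop.classifier}</classifier>
      </configuration>
    </plugin>
  </plugins>
</build>
```

A release build would then run something like `mvn -Phadoop1 package` and `mvn -Phadoop2 package` to produce both artifacts, which sidesteps #ifdef-style source branching as long as the code compiles against both Hadoop APIs.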



On Fri, May 23, 2014 at 6:49 AM, Gokhan Capan <gkhn...@gmail.com> wrote:

> My vote would be releasing mahout with hadoop1 and hadoop2 classifiers
>
> Gokhan
>
>
> On Fri, May 23, 2014 at 4:43 PM, Sebastian Schelter <
> ssc.o...@googlemail.com
> > wrote:
>
> > Big +1
> > Am 23.05.2014 15:33 schrieb "Ted Dunning" <ted.dunn...@gmail.com>:
> >
> > > What do folks think about spinning out a new version of 0.9 that only
> > > changes which version of Hadoop the build uses?
> > >
> > > There have been quite a few questions lately on this topic.
> > >
> > > My suggestion would be that we use minor version numbering to maintain
> > this
> > > and the normal 0.9 release simultaneously if we decide to do a bug fix
> > > release.
> > >
> > > Any thoughts?
> > >
> >
>
