I don't like profiles - they complicate things (imagine what the release
process would look like, with proper versioning and tagging), and profiles
are not as transparent as other options.
I prefer using an assembly per classifier, or having separate (sub)modules
with different dependencies.
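A sketch of what the separate-submodules alternative might look like in the parent POM (module names here are purely illustrative, not actual Mahout modules):

```xml
<!-- Hypothetical sketch of the "separate (sub)modules" idea: one
     module per Hadoop flavor, each declaring its own Hadoop
     dependencies in its own pom.xml. Module names are illustrative. -->
<modules>
  <module>mahout-core</module>
  <module>mahout-core-hadoop1</module>
  <module>mahout-core-hadoop2</module>
</modules>
```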
What do folks think about spinning out a new version of 0.9 that only
changes which version of Hadoop the build uses?
There have been quite a few questions lately on this topic.
My suggestion would be that we use minor version numbering to maintain this
and the normal 0.9 release simultaneously.
Big +1
On 23.05.2014 15:33, Ted Dunning ted.dunn...@gmail.com wrote:
My vote would be releasing mahout with hadoop1 and hadoop2 classifiers
Gokhan
On Fri, May 23, 2014 at 4:43 PM, Sebastian Schelter ssc.o...@googlemail.com
wrote:
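For archive readers: Gokhan's classifier proposal would let downstream projects choose a Hadoop flavor at dependency-declaration time, roughly like this (the coordinates and classifier name below are illustrative, not actual released artifacts):

```xml
<!-- Hypothetical sketch: selecting the Hadoop 2 flavor of a Mahout
     artifact via a Maven classifier. Coordinates and classifier name
     are assumptions for illustration; omitting the classifier would
     pull the default (Hadoop 1) artifact. -->
<dependency>
  <groupId>org.apache.mahout</groupId>
  <artifactId>mahout-core</artifactId>
  <version>0.9</version>
  <classifier>hadoop2</classifier>
</dependency>
```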
Gokhan,
Your suggestion is far superior to what I had in mind.
Let's pretend that yours is the real suggestion.
On Fri, May 23, 2014 at 6:49 AM, Gokhan Capan gkhn...@gmail.com wrote:
+1 for something like that. Again, Spark just makes tons of binary releases
bound to a specific flavor of H-1 or H-2, including CDH etc.
Not sure if it is totally feasible with just build techniques (the
ubiquitous #ifdef macros immediately spring to mind, something I am
totally not missing in
Regarding mechanics, the fact that we have profiles available to do the
build already should make the process very simple ... roughly just adding
-Phadoop2 or some such. Internally, it is setting a few symbols and
tweaking the dependencies slightly.
On Fri, May 23, 2014 at 10:21 AM, Dmitriy
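The build mechanics described above might look roughly like this; the profile name and property are assumptions based on this thread ("-Phadoop2 or some such"), not confirmed flags:

```shell
# Sketch of the proposed invocations; the profile name ("-Phadoop2")
# and property name ("hadoop.version") follow the thread's wording
# and may differ from what was eventually committed.

# Default build against Hadoop 1:
mvn clean install

# Build against Hadoop 2 by activating a profile and pinning a version:
mvn clean install -Phadoop2 -Dhadoop.version=2.2.0
```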
I updated the patch on M-1329 with Gokhan's and Suneel's recommendations
from the comments.
Can anybody test the new patch?
On Wed, Feb 19, 2014 at 6:26 PM, Suneel Marthi suneel_mar...@yahoo.com wrote:
Thanks Sergey. This needs some fixing; it takes about 5 mins on my machine
too for a small Reuters dataset.
Today I updated the patch in M-1329 against trunk. It's the ticket that adds
hadoop2 support to Mahout.
I built Mahout with the patch and all unit tests passed for hadoop1 and hadoop2.
I also tested examples/bin on both Hadoop versions.
Can somebody from the committers review the patch and test it?
Thanks,
Sergey!
--
Sergey, I think it already worked with 2.0, no? (Although it doesn't
actually use the 2.x APIs.) Is this for 2.2, and/or what are the
high-level changes? I'd imagine mostly packaging stuff.
On Wed, Feb 19, 2014 at 2:14 PM, Sergey Svinarchuk
ssvinarc...@hortonworks.com wrote:
Hmm I thought there was already a profile for this, but on second
look, I only see a settable hadoop.version. It has both hadoop-core
and hadoop-common dependencies, which isn't right. I bet this patch
clarifies the difference properly, and that's got to be good.
I think I am thinking of how the
Thanks for the patch Sergey. I tested this with Hadoop 1 and 2 and can confirm
that all unit tests pass and the examples work.
On Wednesday, February 19, 2014 9:39 AM, Sean Owen sro...@gmail.com wrote:
Thanks!
Will this patch be added in Mahout 1.0?
On Wed, Feb 19, 2014 at 5:39 PM, Suneel Marthi suneel_mar...@yahoo.com wrote:
Yes
On Wednesday, February 19, 2014 10:43 AM, Sergey Svinarchuk
ssvinarc...@hortonworks.com wrote: