I only ask because I wasn't sure whether someone was already working on it. I'll file a ticket and see about porting some of it over and making some patches.
Mahout's on Hadoop 0.20 and commons-math-2.0 by now, IIRC? That's what
decomposer is using.

  -jake

On Fri, Sep 25, 2009 at 12:10 PM, Robin Anil <[email protected]> wrote:

> You don't have to ask. Please go ahead, file a JIRA issue, and start
> working on it.
> http://issues.apache.org/jira/browse/MAHOUT
>
> Robin
>
>
> On Sat, Sep 26, 2009 at 12:33 AM, Jake Mannix <[email protected]> wrote:
>
> > Those look very cool, I'd love to see how those compare with doing
> > plain-old Lanczos for SVD on Hadoop. Speaking of which, I've got an
> > implementation of that which I wrote up for my own matrix library
> > ( http://decomposer.googlecode.com ) a while back, and I noticed that
> > we still don't have any large-scale SVD impls in Mahout. Is there any
> > interest by the community for me to try and port that / contribute
> > this to Mahout? It's Apache-licensed, but I'm currently using mostly
> > my own sparse and dense vector writables for use on Hadoop (designed
> > specifically for things like Lanczos and AGHA), so I'd need to port
> > them over to use whichever vector impls Mahout is using.
> >
> >   -jake
> >
> > On Fri, Sep 25, 2009 at 11:25 AM, Ted Dunning <[email protected]>
> > wrote:
> >
> > > Isabel,
> > >
> > > Very interesting post. Here are more accessible resources:
> > >
> > > http://arxiv.org/abs/0909.4061
> > > http://www.pnas.org/content/104/51/20167
> > >
> > > These provide a very interesting and solid link between random
> > > indexing and SVD algorithms. They also definitely provide a
> > > fantastic way to implement large-scale SVD using map-reduce.
> > >
> > > Nice pointer!
> > >
> > > 2009/9/25 Michael Brückner <[email protected]>
> > >
> > > > year's NIPS ( http://nips.cc/Conferences/2009/Program/event.php?ID=1491 )
> > >
> > > --
> > > Ted Dunning, CTO
> > > DeepDyve
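P.S. For anyone curious how the randomized approach in the preprint Ted linked
(http://arxiv.org/abs/0909.4061) maps onto code, here's a rough single-machine
sketch of just the range-finding step. This is not decomposer or Mahout code,
and the class/method names are made up for illustration: form Y = A * Omega for
a random Gaussian test matrix Omega, orthonormalize Y to get Q, and then a
conventional SVD of the small matrix B = Q^T * A recovers the approximate
factors. The two big matrix multiplies (A * Omega and Q^T * A) are the part
that distributes nicely over map-reduce; everything else is small when k is.

import java.util.Random;

public class RandomizedRangeFinder {

  /**
   * Returns an m x k matrix Q with (approximately) orthonormal columns whose
   * span approximates the range of A, following the random-projection idea
   * from Halko/Martinsson/Tropp. Dense double[][] only, for illustration.
   */
  public static double[][] approximateRange(double[][] a, int k, long seed) {
    int m = a.length;
    int n = a[0].length;
    Random rnd = new Random(seed);

    // Y = A * Omega, where Omega is n x k with i.i.d. Gaussian entries.
    // Omega is generated column by column so it never has to be stored whole.
    double[][] y = new double[m][k];
    for (int j = 0; j < k; j++) {
      double[] omegaCol = new double[n];
      for (int i = 0; i < n; i++) {
        omegaCol[i] = rnd.nextGaussian();
      }
      for (int row = 0; row < m; row++) {
        double sum = 0.0;
        for (int col = 0; col < n; col++) {
          sum += a[row][col] * omegaCol[col];
        }
        y[row][j] = sum;
      }
    }

    // Orthonormalize the columns of Y in place (classical Gram-Schmidt),
    // giving Q. A real implementation would use a proper QR decomposition.
    for (int j = 0; j < k; j++) {
      for (int prev = 0; prev < j; prev++) {
        double dot = 0.0;
        for (int row = 0; row < m; row++) {
          dot += y[row][j] * y[row][prev];
        }
        for (int row = 0; row < m; row++) {
          y[row][j] -= dot * y[row][prev];
        }
      }
      double norm = 0.0;
      for (int row = 0; row < m; row++) {
        norm += y[row][j] * y[row][j];
      }
      norm = Math.sqrt(norm);
      for (int row = 0; row < m; row++) {
        y[row][j] /= norm;
      }
    }
    return y;
  }
}

On Hadoop the rows of A would live in HDFS and the Y = A * Omega pass is the
map-reduce job; Q, B = Q^T * A, and the small SVD of B all fit on one node for
modest k.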
