(The build error indicates you have some old class files somewhere -- "clean" first)
Here, the lib/ directory definitely has the right dependencies and it still doesn't work. Benson investigated and found that this is just how Hadoop works in this case.

On Mon, May 9, 2011 at 12:06 AM, Ken Krugler <[email protected]> wrote:

> I haven't been actively running Mahout for a while, but I do watch plenty of
> Hadoop students run into the ClassNotFoundException problem.
>
> A standard Hadoop job jar has a lib subdir, which contains (as jars) all of
> the dependencies.
>
> Typically the missing class problem is caused by somebody building their own
> Hadoop job jar, where they don't include a dependent jar (such as
> mahout-math) in the lib subdir.
>
> Or somebody is trying to run a job locally, using the job jar directly, which
> then has to be unpacked, as otherwise these embedded lib/*.jar classes aren't
> on the classpath.
>
> But neither of those seems to match what Jake was doing:
>
>> (just running things like "./bin/mahout svd -i <input> -o <output> etc... ")
>
> I was going to try this out from trunk, but an svn up on trunk and then "mvn
> install" failed to pass one of the tests:
>
>> Tests run: 2, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 0.025 sec
>> <<< FAILURE!
>> fullRankTall(org.apache.mahout.math.QRDecompositionTest)  Time elapsed:
>> 0.014 sec  <<< ERROR!
>> java.lang.NoSuchFieldError: MAX
>>   at org.apache.mahout.math.QRDecompositionTest.assertEquals(QRDecompositionTest.java:122)
>>   at org.apache.mahout.math.QRDecompositionTest.fullRankTall(QRDecompositionTest.java:38)
>
> -- Ken
