Not sure what's going on there. Your build is somehow internally
inconsistent at runtime since it fails on the client side due to
mismatching Hadoop versions. I can start making up increasingly unlikely
causes, such as building against local copies of the artifacts that
are stale. Running clean
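If stale local artifacts were the cause, purging the cached Mahout jars
from the local Maven repository and rebuilding would rule that out. A
minimal sketch, assuming the default ~/.m2 cache location:

$ # drop any previously cached Mahout artifacts
$ rm -rf ~/.m2/repository/org/apache/mahout
$ # rebuild from the top of the unpacked source tree
$ mvn clean install -DskipTests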
The build was successful with those "warnings".
I could not run a test case from this website:
http://girlincomputerscience.blogspot.com.au/2010/11/apache-mahout.html
Here is my run with the error:
+
Hm, OK something sounds wrong with your directory structure, given the
warnings. I assumed this was changed. It could be that the .tar.gz
distribution isn't quite correctly set up for building from source.
The compilation here has nothing to do with Hadoop. You show a
successful build; what's the p
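If the tarball layout turns out to be the problem, building from the
release tag in the Apache git repository is one way to sidestep it. A
rough sketch; the tag name mahout-0.9 is an assumption, not something
confirmed in this thread:

$ git clone https://github.com/apache/mahout.git
$ cd mahout
$ git checkout mahout-0.9   # assumed release tag
$ mvn clean install -DskipTests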
Where did I modify the build?
Here are my steps of the build.
I got the source from one of the official mirror websites and built it.
The only exception here is that I am using Cloudera CDH 5.0.
This latest CDH 5.0 might not work with Mahout 0.9.
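Compiling Mahout against a Hadoop 2 line generally means overriding the
Hadoop version the Maven build resolves; whether the 0.9 pom exposes a
property or profile for this needs checking against the pom itself. A
hedged sketch, where the hadoop2.version property name is an assumption:

$ # only meaningful if the 0.9 pom actually declares such a property
$ mvn clean install -DskipTests -Dhadoop2.version=2.2.0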
This may be getting into you're-on-your-own territory, since you're
modifying the build. This error means your directory structure doesn't
match the declarations in the POMs: somewhere you said that the parent
of module X is Y, but the location given points to the pom of a module
that isn't Y.
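A quick way to see this kind of mismatch is to compare the <parent>
block a module declares with the pom its relativePath actually resolves
to; some-module below is a hypothetical directory name:

$ # what the module claims its parent is
$ grep -A 4 '<parent>' some-module/pom.xml
$ # the coordinates of the pom that relativePath points at (default: ..)
$ grep -m 1 '<artifactId>' some-module/../pom.xml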
On Wed, Apr 2
Hi Sean,
I am trying to build Mahout again and got some WARNINGs so far.
Can you give me some hints what I have done wrong here?
Thanks for your help so far.
++
$ mv mahout-distribution-0.9 mahout-distribution-0.9.old
$ tar xvf
This still means Hadoop version mismatch.
You shouldn't have to use CLASSPATH. To be safe, use the .job file with
all the dependencies, including the Hadoop client, baked in.
I am guessing that you are picking up other versions of Mahout which
are not compiled for Hadoop 2, in that CLASSPATH? Kind of a
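As a concrete illustration of the .job-file route, the 0.9 distribution
ships a self-contained examples job jar that can be handed straight to
the hadoop launcher, so CLASSPATH can stay untouched; the input and
output paths here are placeholders:

$ hadoop jar mahout-examples-0.9-job.jar \
    org.apache.mahout.driver.MahoutDriver \
    seqdirectory -i /user/me/docs -o /user/me/docs-seq -c UTF-8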
Hi Sean,
Thanks for reviewing my issue.
Actually, the directory (/usr/lib/hadoop-0.20-mapreduce) contains Cloudera v5b1,
which I believe has been upgraded to Hadoop 2.2.
I also have another Hadoop home directory (/usr/lib/hadoop-mapreduce/), which
also belongs to Cloudera v5b1.
Please see belo
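One way to confirm which Hadoop generation each of those directories
really contains is to check what the hadoop command reports and which
client jars the packages ship; the directory names follow the CDH
packages mentioned above:

$ hadoop version
$ ls /usr/lib/hadoop-0.20-mapreduce/*.jar | head
$ ls /usr/lib/hadoop-mapreduce/*.jar | head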
Hello,
I am new to Mahout. I have installed Mahout 0.9.
I have configured Hadoop (1.0.3) on my laptop (Red Hat 6, Lenovo W530). I
am experimenting with the k-means test (by running
mahout-distribution-0.9/examples/bin/cluster-reuters.sh).
I am able to run the k-means test out of the box on hadoop
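For context, the run in question looks roughly like this; the
menu-driven algorithm choice is how the 0.9 script behaves as far as I
recall:

$ cd mahout-distribution-0.9
$ ./examples/bin/cluster-reuters.sh
$ # the script downloads the Reuters corpus, then prompts for an
$ # algorithm; the kmeans option runs seqdirectory, seq2sparse and
$ # kmeans in sequence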
Hello,
Running seqdirectory (from Mahout 0.9) on a large input file gives an exception,
shown below. Any idea?
MAHOUT_LOCAL is set, running locally
14/04/01 12:15:17 INFO common.AbstractJob: Command line arguments:
{--charset=[UTF-8], --chunkSize=[64], --endPhase=[2147483647],
--file
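For reference, a local seqdirectory run matching the arguments in that
log would look something like this; the input and output paths are
placeholders:

$ export MAHOUT_LOCAL=true
$ bin/mahout seqdirectory -i large-input-dir -o output-seqfiles \
    -c UTF-8 -chunk 64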